Nov 22 02:53:54 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 02:53:54 crc restorecon[4770]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:54 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 
02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:53:55 crc 
restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:53:55 crc restorecon[4770]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:53:55 crc restorecon[4770]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 02:53:56 crc kubenswrapper[4952]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.227961 4952 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236847 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236890 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236906 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236920 4952 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236931 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236940 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236949 4952 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236957 4952 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236966 4952 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236975 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236983 4952 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236991 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.236999 4952 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237008 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237017 4952 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237042 4952 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237051 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237059 4952 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237068 4952 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237078 4952 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237086 4952 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237094 4952 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237102 4952 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237111 4952 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237122 4952 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237131 4952 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237140 4952 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237151 4952 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237161 4952 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237174 4952 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237185 4952 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237194 4952 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237204 4952 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237212 4952 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237221 4952 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237230 4952 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237238 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237247 4952 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237255 4952 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237263 4952 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237276 4952 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
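[Editor's note] The "unrecognized feature gate" warnings above recur in several near-identical passes later in this dump; the repetition appears to come from the gate map being re-parsed more than once during startup rather than from distinct gates, and the unrecognized names look like OpenShift-level gates unknown to the embedded Kubernetes feature-gate registry. A sketch, assuming the same hypothetical "journal.txt" dump, to tally how often each name is reported:

```python
import re
from collections import Counter

# Sketch: count repetitions of each unrecognized feature gate across all
# startup passes recorded in the dump.
GATE = re.compile(r"unrecognized feature gate: (\w+)")

with open("journal.txt") as fh:
    counts = Counter(GATE.findall(fh.read()))

for gate, n in counts.most_common():
    print(f"{n:3d}  {gate}")
```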
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237285 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237294 4952 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237302 4952 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237311 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237324 4952 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237334 4952 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237343 4952 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237352 4952 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237360 4952 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237369 4952 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237378 4952 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237386 4952 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237395 4952 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237403 4952 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237412 4952 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237420 4952 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237428 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237437 4952 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237445 4952 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237453 4952 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237461 4952 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237469 4952 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237477 4952 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237486 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237494 4952 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237504 4952 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237514 4952 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237526 4952 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237536 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.237585 4952 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237754 4952 flags.go:64] FLAG: --address="0.0.0.0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237774 4952 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237790 4952 flags.go:64] FLAG: --anonymous-auth="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237803 4952 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237815 4952 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237826 4952 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237838 4952 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237850 4952 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237860 4952 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237870 4952 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237880 4952 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237891 4952 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237901 4952 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237912 4952 flags.go:64] FLAG: --cgroup-root="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237923 4952 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237934 4952 flags.go:64] FLAG: --client-ca-file="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237947 4952 flags.go:64] FLAG: --cloud-config="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237959 4952 flags.go:64] FLAG: --cloud-provider="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237973 4952 flags.go:64] FLAG: --cluster-dns="[]" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.237989 4952 flags.go:64] FLAG: --cluster-domain="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238000 4952 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238014 4952 flags.go:64] FLAG: --config-dir="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238027 4952 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238040 4952 flags.go:64] FLAG: 
--container-log-max-files="5" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238054 4952 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238065 4952 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238079 4952 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238091 4952 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238103 4952 flags.go:64] FLAG: --contention-profiling="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238116 4952 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238127 4952 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238140 4952 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238154 4952 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238169 4952 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238180 4952 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238191 4952 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238202 4952 flags.go:64] FLAG: --enable-load-reader="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238214 4952 flags.go:64] FLAG: --enable-server="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238225 4952 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238239 4952 flags.go:64] FLAG: --event-burst="100" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238250 4952 flags.go:64] FLAG: --event-qps="50" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238262 4952 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238274 4952 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238286 4952 flags.go:64] FLAG: --eviction-hard="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238302 4952 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238314 4952 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238328 4952 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238338 4952 flags.go:64] FLAG: --eviction-soft="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238346 4952 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238355 4952 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238364 4952 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238372 4952 flags.go:64] FLAG: --experimental-mounter-path="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238381 4952 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 
02:53:56.238391 4952 flags.go:64] FLAG: --fail-swap-on="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238400 4952 flags.go:64] FLAG: --feature-gates="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238410 4952 flags.go:64] FLAG: --file-check-frequency="20s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238418 4952 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238428 4952 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238437 4952 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238446 4952 flags.go:64] FLAG: --healthz-port="10248" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238456 4952 flags.go:64] FLAG: --help="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238464 4952 flags.go:64] FLAG: --hostname-override="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238473 4952 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238482 4952 flags.go:64] FLAG: --http-check-frequency="20s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238491 4952 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238500 4952 flags.go:64] FLAG: --image-credential-provider-config="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238508 4952 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238517 4952 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238586 4952 flags.go:64] FLAG: --image-service-endpoint="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238598 4952 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238607 4952 flags.go:64] FLAG: --kube-api-burst="100" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238615 4952 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238626 4952 flags.go:64] FLAG: --kube-api-qps="50" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238636 4952 flags.go:64] FLAG: --kube-reserved="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238648 4952 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238659 4952 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238673 4952 flags.go:64] FLAG: --kubelet-cgroups="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238684 4952 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238694 4952 flags.go:64] FLAG: --lock-file="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238707 4952 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238720 4952 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238730 4952 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238743 4952 flags.go:64] FLAG: --log-json-split-stream="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238752 4952 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238761 4952 flags.go:64] FLAG: --log-text-split-stream="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238770 4952 flags.go:64] FLAG: --logging-format="text" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238778 4952 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238788 4952 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238796 4952 flags.go:64] FLAG: --manifest-url="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238805 4952 flags.go:64] FLAG: --manifest-url-header="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238817 4952 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238826 4952 flags.go:64] FLAG: --max-open-files="1000000" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238837 4952 flags.go:64] FLAG: --max-pods="110" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238846 4952 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238855 4952 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238864 4952 flags.go:64] FLAG: --memory-manager-policy="None" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238873 4952 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238882 4952 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238891 4952 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238899 4952 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238920 4952 flags.go:64] FLAG: --node-status-max-images="50" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238928 4952 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238937 4952 flags.go:64] FLAG: --oom-score-adj="-999" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238946 4952 flags.go:64] FLAG: --pod-cidr="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238968 4952 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238989 4952 flags.go:64] FLAG: --pod-manifest-path="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.238998 4952 flags.go:64] FLAG: --pod-max-pids="-1" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239007 4952 flags.go:64] FLAG: --pods-per-core="0" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239015 4952 flags.go:64] FLAG: --port="10250" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239025 4952 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239034 4952 flags.go:64] FLAG: --provider-id="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239044 4952 flags.go:64] FLAG: --qos-reserved="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239054 4952 flags.go:64] FLAG: 
--read-only-port="10255" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239063 4952 flags.go:64] FLAG: --register-node="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239072 4952 flags.go:64] FLAG: --register-schedulable="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239081 4952 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239097 4952 flags.go:64] FLAG: --registry-burst="10" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239106 4952 flags.go:64] FLAG: --registry-qps="5" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239115 4952 flags.go:64] FLAG: --reserved-cpus="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239123 4952 flags.go:64] FLAG: --reserved-memory="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239135 4952 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239144 4952 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239152 4952 flags.go:64] FLAG: --rotate-certificates="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239161 4952 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239169 4952 flags.go:64] FLAG: --runonce="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239179 4952 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239187 4952 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239196 4952 flags.go:64] FLAG: --seccomp-default="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239206 4952 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239214 4952 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239223 4952 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239232 4952 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239241 4952 flags.go:64] FLAG: --storage-driver-password="root" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239249 4952 flags.go:64] FLAG: --storage-driver-secure="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239258 4952 flags.go:64] FLAG: --storage-driver-table="stats" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239267 4952 flags.go:64] FLAG: --storage-driver-user="root" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239275 4952 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239284 4952 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239294 4952 flags.go:64] FLAG: --system-cgroups="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239302 4952 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239327 4952 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239337 4952 flags.go:64] FLAG: --tls-cert-file="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239346 4952 flags.go:64] 
FLAG: --tls-cipher-suites="[]" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239363 4952 flags.go:64] FLAG: --tls-min-version="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239372 4952 flags.go:64] FLAG: --tls-private-key-file="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239381 4952 flags.go:64] FLAG: --topology-manager-policy="none" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239390 4952 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239399 4952 flags.go:64] FLAG: --topology-manager-scope="container" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239407 4952 flags.go:64] FLAG: --v="2" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239418 4952 flags.go:64] FLAG: --version="false" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239430 4952 flags.go:64] FLAG: --vmodule="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239441 4952 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.239450 4952 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239716 4952 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239728 4952 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239737 4952 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239745 4952 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239754 4952 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239761 4952 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239770 4952 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239783 4952 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
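[Editor's note] The flags.go:64 records above dump the effective value of every kubelet command-line flag, one "FLAG: --name=\"value\"" pair per record. A sketch, again assuming "journal.txt", that recovers them into a dictionary; records that this dump happens to wrap across line breaks mid-pair will be missed by the simple pattern.

```python
import re

# Sketch: rebuild the kubelet's effective flag settings from the
# flags.go:64 records. Pairs split across wrapped lines are skipped.
FLAG = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

with open("journal.txt") as fh:
    flags = dict(FLAG.findall(fh.read()))

print(flags["--node-ip"])               # 192.168.126.11
print(flags["--register-with-taints"])  # node-role.kubernetes.io/master=:NoSchedule
```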
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239793 4952 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239802 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239811 4952 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239819 4952 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239827 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239836 4952 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239844 4952 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239852 4952 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239861 4952 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239869 4952 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239877 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239886 4952 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239895 4952 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239903 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239911 4952 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239946 4952 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239955 4952 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239963 4952 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239973 4952 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.239983 4952 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240004 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240015 4952 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240024 4952 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240035 4952 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240045 4952 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240054 4952 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240063 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240072 4952 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240082 4952 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240133 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240144 4952 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240153 4952 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240161 4952 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240169 4952 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240178 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240187 4952 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240195 4952 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240204 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240212 4952 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240220 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240228 4952 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240236 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240244 4952 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240252 4952 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240268 4952 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240276 4952 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240284 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240292 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240300 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:53:56 crc 
kubenswrapper[4952]: W1122 02:53:56.240307 4952 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240315 4952 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240332 4952 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240340 4952 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240348 4952 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240356 4952 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240364 4952 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240372 4952 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240379 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240388 4952 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240395 4952 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240403 4952 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240411 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.240419 4952 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.240444 4952 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.253445 4952 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.253494 4952 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253629 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253644 4952 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253657 4952 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
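[Editor's note] Each pass of warnings ends in a feature_gate.go:386 record like the one above, printing the merged result as a Go-style map. A sketch for turning that single-line summary into a Python dict; the map literal here is abbreviated from the logged record for readability.

```python
import re

# Sketch: parse the Go-style "feature gates: {map[Name:bool ...]}" summary.
line = ('feature gates: {map[CloudDualStackNodeIPs:true '
        'DisableKubeletCloudCredentialProviders:true KMSv1:true '
        'ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}')

inner = re.search(r"\{map\[(.*)\]\}", line).group(1)
gates = {k: v == "true" for k, v in (kv.split(":") for kv in inner.split())}
print(gates["KMSv1"])  # True
```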
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253673 4952 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253684 4952 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253692 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253701 4952 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253710 4952 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253721 4952 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253731 4952 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253740 4952 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253748 4952 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253756 4952 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253764 4952 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253772 4952 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253784 4952 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253796 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253805 4952 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253815 4952 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253824 4952 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253835 4952 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253845 4952 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253855 4952 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253864 4952 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253873 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253881 4952 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253890 4952 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253898 4952 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253906 4952 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253914 4952 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253923 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253931 4952 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253939 4952 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253947 4952 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253957 4952 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253965 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253973 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253981 4952 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253989 4952 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.253997 4952 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254006 4952 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254013 4952 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254023 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254033 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254043 4952 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254054 4952 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254063 4952 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254074 4952 feature_gate.go:330] unrecognized 
feature gate: MultiArchInstallAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254085 4952 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254096 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254110 4952 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254120 4952 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254129 4952 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254139 4952 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254149 4952 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254159 4952 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254168 4952 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254177 4952 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254186 4952 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254196 4952 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254206 4952 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254216 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254226 4952 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254236 4952 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254246 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254256 4952 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254265 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254275 4952 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254286 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254296 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254306 4952 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.254320 4952 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254673 4952 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254697 4952 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254710 4952 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254722 4952 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254734 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254744 4952 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254754 4952 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254764 4952 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254774 4952 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254784 4952 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254794 4952 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254804 4952 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254815 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254825 4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254834 4952 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254844 4952 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254856 4952 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254872 4952 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254883 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254895 4952 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254906 4952 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254916 4952 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254926 4952 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254935 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254945 4952 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254955 4952 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254965 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254975 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.254989 4952 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255001 4952 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255013 4952 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255024 4952 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255033 4952 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255043 4952 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255055 4952 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255066 4952 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255076 4952 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255086 4952 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255095 4952 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255104 4952 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255115 4952 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255125 4952 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255135 
4952 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255145 4952 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255155 4952 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255165 4952 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255175 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255185 4952 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255194 4952 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255204 4952 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255214 4952 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255227 4952 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255239 4952 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255250 4952 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255260 4952 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255271 4952 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255281 4952 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255291 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255301 4952 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255311 4952 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255321 4952 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255331 4952 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255340 4952 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255350 4952 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255359 4952 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255370 4952 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255380 4952 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255390 4952 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255403 4952 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255415 4952 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.255427 4952 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.255442 4952 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.256687 4952 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.262537 4952 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.262684 4952 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.264755 4952 server.go:997] "Starting client certificate rotation"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.264803 4952 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.264994 4952 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 14:38:31.991639425 +0000 UTC
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.265123 4952 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1163h44m35.72652065s for next certificate rotation
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.299991 4952 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.307424 4952 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.329210 4952 log.go:25] "Validated CRI v1 runtime API"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.371658 4952 log.go:25] "Validated CRI v1 image API"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.373915 4952 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.381644 4952 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-02-49-01-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.381695 4952 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.410714 4952 manager.go:217] Machine: {Timestamp:2025-11-22 02:53:56.406957177 +0000 UTC m=+0.712974510 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2d0f1a1c-2ee1-4b37-849e-8151c669da05 BootID:77d94e3a-20e5-45ab-9435-01440651fcdb Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:93:b5:c1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:93:b5:c1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0c:7f:d3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:39:c9:12 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b8:d3:41 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3c:67:c0 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:8a:cb:e9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:13:87:c5:09:b3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:bb:2d:9a:c5:a3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.411131 4952 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
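
The per-filesystem Capacity and Inodes figures in the Machine entry above come from statfs(2) against each mountpoint. A minimal Go sketch of that calculation follows; it is illustrative only (the mountpoint list is hard-coded from the "Filesystem partitions" entry above, not discovered from /proc/mounts as cAdvisor does), not kubelet or cAdvisor source.

    package main

    import (
        "fmt"

        "golang.org/x/sys/unix"
    )

    func main() {
        // Mountpoints copied from the log above; a real implementation would
        // enumerate them from /proc/mounts instead of hard-coding.
        for _, mnt := range []string{"/var", "/boot", "/run", "/tmp", "/dev/shm"} {
            var st unix.Statfs_t
            if err := unix.Statfs(mnt, &st); err != nil {
                fmt.Printf("%s: %v\n", mnt, err)
                continue
            }
            // Capacity in bytes = total blocks * block size; Files = inode count,
            // matching the Capacity:/Inodes: fields in the Machine entry.
            fmt.Printf("%s: Capacity:%d Inodes:%d HasInodes:%t\n",
                mnt, st.Blocks*uint64(st.Bsize), st.Files, st.Files > 0)
        }
    }
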
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.411296 4952 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.412805 4952 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.413116 4952 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.413168 4952 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.413513 4952 topology_manager.go:138] "Creating topology manager with none policy"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.413530 4952 container_manager_linux.go:303] "Creating device plugin manager"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.414232 4952 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.414280 4952 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.415139 4952 state_mem.go:36] "Initialized new in-memory state store"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.415265 4952 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.423229 4952 kubelet.go:418] "Attempting to sync node with API server"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.423277 4952 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
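
The nodeConfig above pins down this node's reservations: SystemReserved of cpu=200m, memory=350Mi (plus 350Mi ephemeral-storage), a memory.available hard-eviction floor of 100Mi, and KubeReserved null. Kubelet's documented node-allocatable formula is capacity minus kube-reserved, system-reserved, and hard-eviction thresholds. A minimal sketch of that arithmetic for the MemoryCapacity of 33654124544 bytes logged in the Machine entry, using k8s.io/apimachinery quantities (illustrative only, not kubelet source):

    package main

    import (
        "fmt"

        "k8s.io/apimachinery/pkg/api/resource"
    )

    func main() {
        capacity := resource.MustParse("33654124544") // MemoryCapacity from the Machine entry
        systemReserved := resource.MustParse("350Mi") // SystemReserved memory in nodeConfig
        evictionHard := resource.MustParse("100Mi")   // memory.available hard-eviction threshold

        // allocatable = capacity - system-reserved - kube-reserved - hard eviction;
        // KubeReserved is null in the nodeConfig above, so it contributes nothing.
        allocatable := capacity.DeepCopy()
        allocatable.Sub(systemReserved)
        allocatable.Sub(evictionHard)
        fmt.Printf("allocatable memory: %s\n", allocatable.String()) // should print 33182265344
    }
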
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.423326 4952 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.423352 4952 kubelet.go:324] "Adding apiserver pod source"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.423381 4952 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.428072 4952 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.429347 4952 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.435505 4952 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.436598 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.436639 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.437810 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.437798 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438339 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438401 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438422 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438444 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438473 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438493 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438512 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438574 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438595 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438616 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438643 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.438661 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.439676 4952 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.440808 4952 server.go:1280] "Started kubelet"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.441525 4952 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.441801 4952 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.441876 4952 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.442788 4952 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 22 02:53:56 crc systemd[1]: Started Kubernetes Kubelet.
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.445579 4952 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.445670 4952 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.445857 4952 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:04:40.897826157 +0000 UTC
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.445921 4952 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1228h10m44.451912359s for next certificate rotation
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.446086 4952 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.446239 4952 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.446265 4952 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.446338 4952 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.447350 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.447463 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.447972 4952 server.go:460] "Adding debug handlers to kubelet server"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448076 4952 factory.go:55] Registering systemd factory
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448113 4952 factory.go:221] Registration of the systemd container factory successfully
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448721 4952 factory.go:153] Registering CRI-O factory
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448761 4952 factory.go:221] Registration of the crio container factory successfully
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448907 4952 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448946 4952 factory.go:103] Registering Raw factory
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.448974 4952 manager.go:1196] Started watching for new ooms in manager
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.450261 4952 manager.go:319] Starting recovery of all containers
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.452102 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="200ms"
Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.461723 4952 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.144:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a349925f9fe64 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 02:53:56.440751716 +0000 UTC m=+0.746769029,LastTimestamp:2025-11-22 02:53:56.440751716 +0000 UTC m=+0.746769029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472265 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472350 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472372 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472392 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472413 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472431 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472451 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472470 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472493 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472524 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472575 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472605 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472625 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472648 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472670 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472689 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472708 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472792 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472811 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472831 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472848 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472868 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472885 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472907 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472929 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472950 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.472977 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473003 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473026 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473049 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473070 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473094 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473116 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473138 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473165 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473196 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473231 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473263 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473294 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.473324 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479075 4952 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479669 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479715 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479745 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479771 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479798 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479829 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479853 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479882 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479923 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.479953 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480004 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480031 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480073 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480105 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480136 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480165 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480197 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480223 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480247 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480274 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480306 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480333 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480359 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480386 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480411 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480440 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480465 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480491 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480517 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480585 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480615 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480655 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480688 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480713 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480738 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480764 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480792 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480817 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480845 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480876 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480902 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480928 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480955 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.480981 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481003 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481022 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481044 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481063 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481087 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481106 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481126 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481146 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481166 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481186 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481204 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481222 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481245 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481266 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481284 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481305 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481323 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481345 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481366 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481386 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481415 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481438 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481461 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481482 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481503 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481526 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481580 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481606 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481627 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481653 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481674 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481693 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481714 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481732 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481752 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481771 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481791 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481809 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481826 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481847 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481867 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481884 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481902 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481922 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481940 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481959 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481978 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.481998 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482017 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482042 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482071 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482101 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482127 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482151 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482176 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482201 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482223 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482245 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482264 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482289 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482309 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def"
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482328 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482345 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482365 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482389 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482408 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482427 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482460 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482487 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482608 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482635 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482700 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482721 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482741 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482793 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482816 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482843 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482916 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.482945 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483018 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483037 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483093 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483122 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483142 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483202 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483227 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483305 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483332 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483405 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483482 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483514 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483535 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483599 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483622 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483640 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483696 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483718 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483769 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483791 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483812 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483865 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483888 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483908 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.483960 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484077 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484100 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484151 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484170 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484189 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484240 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484260 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484280 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484334 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484357 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484376 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484439 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484468 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484529 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484588 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484609 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484628 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484681 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484699 4952 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484810 4952 reconstruct.go:97] "Volume reconstruction finished" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.484824 4952 reconciler.go:26] "Reconciler: start to sync state" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.486473 4952 manager.go:324] Recovery completed Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.502707 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.504653 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.504707 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.504723 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.505661 4952 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 02:53:56 crc kubenswrapper[4952]: 
I1122 02:53:56.505696 4952 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.505724 4952 state_mem.go:36] "Initialized new in-memory state store" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.527000 4952 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.529437 4952 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.529782 4952 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.529849 4952 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.529916 4952 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.530686 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.530761 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.530780 4952 policy_none.go:49] "None policy: Start" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.531450 4952 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.531490 4952 state_mem.go:35] "Initializing new in-memory state store" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.546284 4952 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.606148 4952 manager.go:334] "Starting Device Plugin manager" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.606211 4952 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.606228 4952 server.go:79] "Starting device plugin registration server" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.606924 4952 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.606948 4952 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.607238 4952 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.607491 4952 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.607529 4952 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.619957 4952 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.630284 4952 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.630364 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.631474 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.631505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.631518 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.631670 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.632180 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.632244 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.632956 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.632985 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.633010 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.633087 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.633422 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.633640 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.633993 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634043 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634057 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634376 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634442 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634454 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.634725 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.635162 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.635417 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.638898 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.638959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.638980 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.639315 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.639354 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.639368 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.639844 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.640197 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.640259 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.640304 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.640366 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.640394 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642252 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642350 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642393 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.642460 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.643174 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.643300 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.645641 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.645708 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.645734 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.653310 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="400ms" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.686736 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.686826 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.686958 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687029 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687084 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687208 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687295 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687341 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687376 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687431 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687475 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687505 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687534 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687602 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.687657 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.707173 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: 
I1122 02:53:56.708586 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.708647 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.708668 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.708703 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.709287 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.788561 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.788614 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.788816 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.788880 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.788928 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789008 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789072 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789095 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789176 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789190 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789238 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789335 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789367 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789381 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789508 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789574 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789588 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789576 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789615 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789641 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789598 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789782 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789870 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789893 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.789915 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.790002 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.790021 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.790050 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.790147 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.790156 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.801519 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.811708 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.865618 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3a57b30e171e5ecd588fac7b20105b2965ef9fb5ff7b47eca6c9842019324306 WatchSource:0}: Error finding container 3a57b30e171e5ecd588fac7b20105b2965ef9fb5ff7b47eca6c9842019324306: Status 404 returned error can't find the container with id 3a57b30e171e5ecd588fac7b20105b2965ef9fb5ff7b47eca6c9842019324306 Nov 22 02:53:56 crc kubenswrapper[4952]: W1122 02:53:56.871434 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1d6db72a1a5b60366f19ddf4a4ce191d13858977f0f4bfb0c8f842276dfee4d7 WatchSource:0}: Error finding container 1d6db72a1a5b60366f19ddf4a4ce191d13858977f0f4bfb0c8f842276dfee4d7: Status 404 returned error can't find the container with id 1d6db72a1a5b60366f19ddf4a4ce191d13858977f0f4bfb0c8f842276dfee4d7 Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.909504 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.911268 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.911321 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.911350 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.911393 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:56 crc kubenswrapper[4952]: E1122 02:53:56.912150 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Nov 22 02:53:56 crc kubenswrapper[4952]: I1122 02:53:56.994078 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.007895 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.010215 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9791c9bfbd103ea00338fa9f34eb1e52ddaeed93414b2b7bac5da2b10b81379e WatchSource:0}: Error finding container 9791c9bfbd103ea00338fa9f34eb1e52ddaeed93414b2b7bac5da2b10b81379e: Status 404 returned error can't find the container with id 9791c9bfbd103ea00338fa9f34eb1e52ddaeed93414b2b7bac5da2b10b81379e Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.035460 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-dd9364d411da8a1194130793f54b6d617e7cf75849080065e69c636ce4e52457 WatchSource:0}: Error finding container dd9364d411da8a1194130793f54b6d617e7cf75849080065e69c636ce4e52457: Status 404 returned error can't find the container with id dd9364d411da8a1194130793f54b6d617e7cf75849080065e69c636ce4e52457 Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.054602 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="800ms" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.080105 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.096500 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3a59cfd67138e7c00f57186e85e841ff47f9b0a6825b23acec1e187adb59b03e WatchSource:0}: Error finding container 3a59cfd67138e7c00f57186e85e841ff47f9b0a6825b23acec1e187adb59b03e: Status 404 returned error can't find the container with id 3a59cfd67138e7c00f57186e85e841ff47f9b0a6825b23acec1e187adb59b03e Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.293104 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.293285 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.312262 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.313506 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.313570 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.313585 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.313614 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.313930 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.442403 4952 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.535773 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3a59cfd67138e7c00f57186e85e841ff47f9b0a6825b23acec1e187adb59b03e"} Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.538355 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dd9364d411da8a1194130793f54b6d617e7cf75849080065e69c636ce4e52457"} Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.541656 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396"} Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.541688 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9791c9bfbd103ea00338fa9f34eb1e52ddaeed93414b2b7bac5da2b10b81379e"} Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.544911 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a57b30e171e5ecd588fac7b20105b2965ef9fb5ff7b47eca6c9842019324306"} Nov 22 02:53:57 crc kubenswrapper[4952]: I1122 02:53:57.547370 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1d6db72a1a5b60366f19ddf4a4ce191d13858977f0f4bfb0c8f842276dfee4d7"} Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.629829 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.629935 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.856012 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="1.6s" Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.889280 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.889422 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:53:57 crc kubenswrapper[4952]: W1122 02:53:57.929423 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:57 crc kubenswrapper[4952]: E1122 02:53:57.929535 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.114607 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.116459 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.116506 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.116517 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.116565 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:58 crc kubenswrapper[4952]: E1122 02:53:58.117174 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Nov 22 02:53:58 crc kubenswrapper[4952]: E1122 02:53:58.246934 4952 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.144:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a349925f9fe64 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 02:53:56.440751716 +0000 UTC m=+0.746769029,LastTimestamp:2025-11-22 02:53:56.440751716 +0000 UTC m=+0.746769029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.442697 4952 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.554707 4952 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559" exitCode=0 Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.554790 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.555124 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.556680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.556712 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: 
I1122 02:53:58.556724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.558767 4952 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c" exitCode=0 Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.558858 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.558938 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.559978 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.560052 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.560083 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.562449 4952 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630" exitCode=0 Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.562527 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.562659 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.562687 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564604 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564630 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564640 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564826 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.564845 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.569341 4952 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1" exitCode=0 Nov 
22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.569422 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.569623 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.577129 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.577210 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.577250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.581411 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.581485 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.581502 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293"} Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.581949 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.583838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.583895 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4952]: I1122 02:53:58.583916 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.442608 4952 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Nov 22 02:53:59 crc kubenswrapper[4952]: E1122 02:53:59.457316 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="3.2s" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.588098 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.588148 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.588158 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.588262 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.591087 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.591120 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.591131 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.597433 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.597472 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.597486 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.597496 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.601083 4952 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4" exitCode=0 Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.601176 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.601220 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.603924 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.603970 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.603991 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.606683 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09"} Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.607051 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.607962 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.613835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.613881 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.613893 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.614630 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.615174 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.615205 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.718046 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.719626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.719680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.719693 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4952]: I1122 02:53:59.719729 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:59 crc kubenswrapper[4952]: E1122 02:53:59.720320 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.614406 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71"} Nov 22 02:54:00 crc kubenswrapper[4952]: 
I1122 02:54:00.614658 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.616113 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.616157 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.616173 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.620688 4952 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb" exitCode=0 Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.620816 4952 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.620917 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.621589 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.621962 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb"} Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.622037 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.622853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.622919 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.622935 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.623708 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.623735 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.623748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.624007 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.624043 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4952]: I1122 02:54:00.624056 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.628507 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0"} Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.628587 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382"} Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.628602 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c"} Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.628624 4952 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.628690 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.630624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.631169 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.631224 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.794342 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.921900 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.922183 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.923732 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.923806 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.923825 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.929701 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.964923 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.965252 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.966746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.966792 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 
02:54:01 crc kubenswrapper[4952]: I1122 02:54:01.966808 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.638940 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.639879 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640249 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5"} Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640327 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f"} Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640475 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640924 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640963 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.640980 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.641049 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.641148 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.641194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.642034 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.642133 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.642153 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.920916 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.922896 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.922979 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.923002 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4952]: I1122 02:54:02.923047 4952 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.005949 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.074809 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.641482 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.641884 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.642879 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.642935 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.642954 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.643008 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.643093 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4952]: I1122 02:54:03.643123 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.039668 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.645539 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.645635 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647487 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647503 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647586 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647663 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4952]: I1122 02:54:04.647687 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.234105 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 
02:54:05.649230 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.650789 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.650850 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.650864 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.818790 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.819079 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.821092 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.821176 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4952]: I1122 02:54:05.821198 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4952]: E1122 02:54:06.620328 4952 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.636645 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.636959 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.638852 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.638920 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.638947 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.644383 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.655213 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.656761 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.656816 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4952]: I1122 02:54:07.656839 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4952]: I1122 02:54:08.111174 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:08 crc kubenswrapper[4952]: I1122 02:54:08.658728 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:08 crc kubenswrapper[4952]: I1122 02:54:08.660200 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4952]: I1122 02:54:08.660265 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4952]: I1122 02:54:08.660293 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4952]: W1122 02:54:10.356642 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.357382 4952 trace.go:236] Trace[1517062513]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:54:00.355) (total time: 10001ms): Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[1517062513]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (02:54:10.356) Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[1517062513]: [10.001700149s] [10.001700149s] END Nov 22 02:54:10 crc kubenswrapper[4952]: E1122 02:54:10.357426 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 02:54:10 crc kubenswrapper[4952]: W1122 02:54:10.359741 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.359825 4952 trace.go:236] Trace[890188385]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:54:00.358) (total time: 10001ms): Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[890188385]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (02:54:10.359) Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[890188385]: [10.001015734s] [10.001015734s] END Nov 22 02:54:10 crc kubenswrapper[4952]: E1122 02:54:10.359850 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.443202 4952 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 22 02:54:10 crc kubenswrapper[4952]: W1122 02:54:10.469291 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.469446 4952 trace.go:236] Trace[1620405009]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:54:00.467) (total time: 10001ms): Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[1620405009]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:54:10.469) Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[1620405009]: [10.001890778s] [10.001890778s] END Nov 22 02:54:10 crc kubenswrapper[4952]: E1122 02:54:10.469487 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.636871 4952 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.636963 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.667210 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.669185 4952 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71" exitCode=255 Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.669235 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71"} Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.669433 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.674172 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.674329 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 
crc kubenswrapper[4952]: I1122 02:54:10.674345 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.676035 4952 scope.go:117] "RemoveContainer" containerID="75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71" Nov 22 02:54:10 crc kubenswrapper[4952]: W1122 02:54:10.797032 4952 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 02:54:10 crc kubenswrapper[4952]: I1122 02:54:10.797179 4952 trace.go:236] Trace[260617315]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:54:00.795) (total time: 10001ms): Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[260617315]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:54:10.797) Nov 22 02:54:10 crc kubenswrapper[4952]: Trace[260617315]: [10.001454441s] [10.001454441s] END Nov 22 02:54:10 crc kubenswrapper[4952]: E1122 02:54:10.797214 4952 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.341400 4952 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.341474 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.347235 4952 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.347344 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.676315 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.678869 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8"} Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.679038 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.680195 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.680280 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.680303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4952]: I1122 02:54:11.794695 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:12 crc kubenswrapper[4952]: I1122 02:54:12.682120 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:12 crc kubenswrapper[4952]: I1122 02:54:12.683500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4952]: I1122 02:54:12.683585 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4952]: I1122 02:54:12.683604 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.011616 4952 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.048820 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.049085 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.050698 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.050793 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.050814 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.056665 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.689219 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.691445 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.691503 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.691523 4952 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4952]: I1122 02:54:14.820739 4952 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.273953 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.292596 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.435976 4952 apiserver.go:52] "Watching apiserver" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.442588 4952 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.443150 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.443798 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.443866 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.443938 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.443984 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:15 crc kubenswrapper[4952]: E1122 02:54:15.444202 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.444366 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:15 crc kubenswrapper[4952]: E1122 02:54:15.444430 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.444934 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:15 crc kubenswrapper[4952]: E1122 02:54:15.445011 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.446754 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447165 4952 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447217 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447216 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447428 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447919 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.447978 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.448043 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.448126 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.448166 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.491497 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.524498 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43a
e66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.543409 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.562799 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.579948 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.593271 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.610695 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: I1122 02:54:15.626534 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:15 crc kubenswrapper[4952]: E1122 02:54:15.707992 4952 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.339597 4952 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.344633 4952 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.344964 4952 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.358280 4952 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445470 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445539 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445613 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445648 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 
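
Every "Failed to update status for pod" entry above fails for the same reason: pod status patches in these namespaces pass through the mutating admission webhook pod.network-node-identity.openshift.io, whose backend at https://127.0.0.1:9743 is not listening yet, so the apiserver surfaces "connection refused" as an internal error. A sketch of the registration shape that produces this behavior; only the webhook name, URL, and 10s timeout come from the log, while the rules and failure policy are assumptions (a Fail policy is what turns a dead backend into a hard error rather than a skipped hook):

    package main

    import (
        "fmt"

        admissionregistrationv1 "k8s.io/api/admissionregistration/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
        url := "https://127.0.0.1:9743/pod" // endpoint seen in the errors above
        fail := admissionregistrationv1.Fail // assumed; Ignore would skip a dead backend
        timeout := int32(10)                 // matches the ?timeout=10s in the Post URL
        sideEffects := admissionregistrationv1.SideEffectClassNone

        cfg := admissionregistrationv1.MutatingWebhookConfiguration{
            ObjectMeta: metav1.ObjectMeta{Name: "pod.network-node-identity.openshift.io"},
            Webhooks: []admissionregistrationv1.MutatingWebhook{{
                Name:                    "pod.network-node-identity.openshift.io",
                ClientConfig:            admissionregistrationv1.WebhookClientConfig{URL: &url},
                FailurePolicy:           &fail,
                TimeoutSeconds:          &timeout,
                SideEffects:             &sideEffects,
                AdmissionReviewVersions: []string{"v1"},
                // Assumed rule: intercept pod status updates, which is the
                // operation failing in the entries above.
                Rules: []admissionregistrationv1.RuleWithOperations{{
                    Operations: []admissionregistrationv1.OperationType{admissionregistrationv1.Update},
                    Rule: admissionregistrationv1.Rule{
                        APIGroups:   []string{""},
                        APIVersions: []string{"v1"},
                        Resources:   []string{"pods/status"},
                    },
                }},
            }},
        }
        fmt.Printf("%+v\n", cfg.Webhooks[0].ClientConfig)
    }

Once the network-node-identity webhook container is listening on 9743, these patches should go through and the status manager catches up on its own.
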
02:54:16.445699 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445731 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445807 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445841 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445934 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.445970 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446021 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446055 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446127 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446160 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 
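
Many of the volumes being unmounted in this run are kube-api-access-* projected volumes, one per torn-down pod. For reference, this is the standard shape the apiserver's ServiceAccount admission injects, sketched with client-go types; the volume name below is copied from the entries above, and 3607s is the upstream default bound-token lifetime:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607)
        vol := corev1.Volume{
            Name: "kube-api-access-4d4hj", // name taken from the unmount entries above
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        // Bound service-account token, rotated by the kubelet.
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            ExpirationSeconds: &expiry,
                            Path:              "token",
                        }},
                        // Cluster CA bundle, from the kube-root-ca.crt ConfigMap
                        // whose informer cache population is logged above.
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        // The pod's namespace via the downward API.
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
                            }},
                        }},
                    },
                },
            },
        }
        fmt.Println(vol.Name)
    }
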
02:54:16.446195 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446227 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446283 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446317 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446349 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446380 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446415 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446446 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446478 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446509 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:54:16 crc 
kubenswrapper[4952]: I1122 02:54:16.446539 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446624 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446657 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446689 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446720 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446749 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446780 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446812 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446876 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446906 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446941 
4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446973 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447003 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447032 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447068 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447100 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447135 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447184 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447219 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447252 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447285 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447317 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447348 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447384 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447422 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447460 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446003 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446100 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446205 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446353 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.446887 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447158 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447278 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447339 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447352 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447490 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447498 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447740 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447772 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447779 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447773 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447874 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447904 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447927 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447953 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447974 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.447995 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448018 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448039 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448131 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448161 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448195 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448231 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448270 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448296 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448319 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448342 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448370 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448397 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448420 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448446 4952 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448476 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448502 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448524 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448566 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448591 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448640 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448665 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448694 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448717 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 
22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448740 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448767 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448791 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448813 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448835 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448862 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448885 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448906 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448929 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448950 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 
22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.448975 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449000 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449024 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449047 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449070 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449092 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449118 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449147 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449125 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449181 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449308 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449350 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449401 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449485 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.449758 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:16.949727442 +0000 UTC m=+21.255744725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.454500 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.454906 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.449527 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455004 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455104 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455100 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455143 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455235 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455277 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455318 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455355 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455389 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455426 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455461 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455499 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455608 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455947 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456020 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456076 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456129 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456190 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456246 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456296 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456346 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456402 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456453 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456501 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456575 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456611 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456648 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456687 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456727 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456779 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456829 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456887 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 
02:54:16.456955 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457065 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457332 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457384 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457428 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457469 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457513 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457603 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457671 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457727 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 
02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457782 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457833 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457870 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457947 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457984 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458017 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458052 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458085 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458125 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458162 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458198 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458237 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458271 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458306 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458342 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458377 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458410 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458443 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458479 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458513 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458580 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458616 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458649 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458683 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458724 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458759 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458795 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458828 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458863 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458901 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458938 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458972 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459008 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459043 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459076 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459111 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459147 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459185 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459220 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459258 4952 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459296 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459331 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459364 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459398 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459432 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459466 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459516 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459593 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459645 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459701 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459754 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459852 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459921 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459979 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460019 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460063 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460111 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460150 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460187 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460223 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460260 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460303 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460342 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460382 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460422 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460500 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460538 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc 
kubenswrapper[4952]: I1122 02:54:16.460591 4952 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460614 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460637 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460659 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460683 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460705 4952 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460727 4952 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460751 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460773 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460793 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460814 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460836 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460857 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 
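The "Volume detached ... DevicePath \"\"" records above are the reconciler's final bookkeeping for each unmounted volume; the "UnmountVolume.TearDown succeeded" entries that follow are the operations that produced them (kubelet logs from several goroutines, so sub-second timestamps in this stream are not monotonic). Each TearDown line carries an OuterVolumeSpecName, the volume's name in the pod spec, an InnerVolumeSpecName, the name the resolved plugin operates on (identical for inline volumes such as these), and the PluginName it dispatched to. A sketch of that dispatch, using hypothetical types rather than the real volume-plugin interface:

package main

import "fmt"

// tearDownFunc removes one pod-local volume mount; the real plugins clear
// /var/lib/kubelet/pods/<podUID>/volumes/<plugin-dir>/<volume>.
type tearDownFunc func(podUID, volume string) error

// The plugin names that appear in this section of the log.
var plugins = map[string]tearDownFunc{
	"kubernetes.io/secret":    func(string, string) error { return nil },
	"kubernetes.io/configmap": func(string, string) error { return nil },
	"kubernetes.io/projected": func(string, string) error { return nil },
	"kubernetes.io/empty-dir": func(string, string) error { return nil },
}

// unmountVolume mirrors the logged flow: resolve the plugin by name, run its
// TearDown, then report success so the reconciler can mark the volume detached.
func unmountVolume(pluginName, podUID, volume string) {
	td, ok := plugins[pluginName]
	if !ok {
		fmt.Printf("no plugin %q registered\n", pluginName)
		return
	}
	if err := td(podUID, volume); err != nil {
		fmt.Printf("UnmountVolume.TearDown failed for %q: %v\n", volume, err)
		return
	}
	fmt.Printf("UnmountVolume.TearDown succeeded for volume %q pod %q (plugin %s)\n",
		volume, podUID, pluginName)
}

func main() {
	unmountVolume("kubernetes.io/secret",
		"22c825df-677d-4ca6-82db-3454ed06e783", "machine-approver-tls")
}

Further down in this section, two MountVolume.SetUp attempts for the networking-console-plugin pod fail because their ConfigMap and Secret objects are not yet registered in kubelet's object cache; nestedpendingoperations then blocks retries for 500ms (durationBeforeRetry), a delay kubelet grows exponentially on repeated failures.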
crc kubenswrapper[4952]: I1122 02:54:16.460878 4952 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460898 4952 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460919 4952 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460940 4952 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455164 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455435 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455519 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.468857 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456181 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456301 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456357 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456640 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456754 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.456807 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457020 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457082 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.469024 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457091 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457380 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.457737 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458205 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458250 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458684 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.469300 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.458911 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459004 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.459691 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460566 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460689 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.469430 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460668 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.460895 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.461069 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.461126 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.461186 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.461258 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.461313 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.463909 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.464319 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.467607 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.468227 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.468311 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.468625 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.468701 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.455627 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.469870 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.470008 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.470130 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.470377 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.470758 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.471010 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.477495 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.477740 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478319 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478441 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478636 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478684 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478684 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478720 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.478834 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.479303 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.479394 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.479767 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.479969 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.480939 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.481627 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.482010 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.482431 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.482765 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483025 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483069 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483092 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483100 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483319 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483388 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483426 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483637 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.483684 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.484171 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.484896 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.484921 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.485041 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.485989 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.486519 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.486697 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:16.986663767 +0000 UTC m=+21.292681230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.486826 4952 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.486969 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.486986 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.487155 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:16.987118478 +0000 UTC m=+21.293135961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.487770 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.487902 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.488385 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.489800 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.499767 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.499998 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.500357 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.500404 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.500576 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.500674 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.505274 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.505293 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.505849 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.505977 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506240 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506590 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506633 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506792 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506880 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.506950 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.507192 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.507647 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.507828 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.508293 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.508470 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.508688 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.509062 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.509453 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.509502 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.509781 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510035 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510173 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510365 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510897 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510950 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510985 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.510978 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511106 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511388 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511415 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511411 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511535 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.511991 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.512259 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.512530 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.512641 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.512904 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.512973 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.512980 4952 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.513224 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.513234 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.512996 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.513260 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.513353 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.513390 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.513429 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:17.013385263 +0000 UTC m=+21.319402536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:16 crc kubenswrapper[4952]: E1122 02:54:16.513511 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:17.013465135 +0000 UTC m=+21.319482448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.513828 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
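PluginName "kubernetes.io/projected", VolumeGidValue ""

The projected.go and nestedpendingoperations records above explain why the two openshift-network-diagnostics pods stay pending: their kube-api-access-* projected volumes reference the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps, which are not yet registered in the kubelet's informer caches after the restart, so each MountVolume.SetUp attempt fails and is re-queued with an exponential backoff starting at the logged 500ms. A minimal Go sketch of that retry shape (the setUp stand-in, the failure count, and the 2m2s cap are illustrative assumptions, not kubelet source):

```go
// Minimal sketch of the retry shape implied by the records above: a
// per-volume exponential backoff that starts at the logged 500ms.
package main

import (
	"errors"
	"fmt"
	"time"
)

// backoff doubles its delay after every failed attempt, up to maxDelay.
type backoff struct {
	delay    time.Duration
	maxDelay time.Duration
}

func (b *backoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.maxDelay {
		b.delay = b.maxDelay
	}
	return d
}

func main() {
	// Hypothetical stand-in for MountVolume.SetUp: keeps failing until the
	// referenced ConfigMaps would have appeared in the informer cache.
	attempts := 0
	setUp := func() error {
		attempts++
		if attempts < 4 {
			return errors.New(`object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered`)
		}
		return nil
	}

	b := &backoff{delay: 500 * time.Millisecond, maxDelay: 2*time.Minute + 2*time.Second}
	for {
		err := setUp()
		if err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		}
		d := b.next()
		fmt.Printf("MountVolume.SetUp failed: %v; no retries permitted for %v\n", err, d)
		time.Sleep(d)
	}
}
```

Once the ConfigMap objects land in the cache, the same operation succeeds on the next attempt, which is the pattern the surrounding MountVolume.SetUp records show for other volumes.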
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.514474 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.514474 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.516350 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.516604 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.517208 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.517763 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.517964 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.519689 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.519796 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.520036 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.520378 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.520464 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.520744 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.520734 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.521011 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.521206 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.521451 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.521948 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.521949 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.522346 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.522474 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.523979 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.524158 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.525114 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.525938 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.526067 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.532462 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.532501 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.532861 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.533259 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.533893 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.534255 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.534264 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.534279 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.534355 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.534460 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.535201 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.535332 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.537383 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.537460 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.538870 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.539329 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.539402 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.539857 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.540069 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.540503 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.540693 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.544023 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.549513 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.549622 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.549773 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.551214 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.552867 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.554346 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.554535 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.558173 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
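PluginName "kubernetes.io/empty-dir", VolumeGidValue ""

The status_manager records above are the other side of the same startup race: the kubelet cannot patch the etcd-crc pod status because the API server must first consult the pod.network-node-identity.openshift.io mutating webhook, whose backend on 127.0.0.1:9743 is not listening yet, so the Post fails with "connection refused". A hedged connectivity probe of that endpoint (plain TCP dial only; the address and 10s timeout come from the logged URL https://127.0.0.1:9743/pod?timeout=10s, everything else is illustrative):

```go
// Hedged sketch reproducing the failing step in the record above. A
// "connection refused" happens at the TCP layer, before any TLS or HTTP
// exchange, so a plain dial is enough to surface the same symptom.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 10*time.Second)
	if err != nil {
		fmt.Println("webhook unreachable:", err) // e.g. connect: connection refused
		return
	}
	defer conn.Close()
	fmt.Println("webhook endpoint is accepting connections")
}
```

The mounts for network-node-identity-vrzqb (webhook-cert, ovnkube-identity-cm, kube-api-access-s2kz5) succeed in the surrounding records, so once that pod's containers start, the dial above should succeed and the queued status patches can be retried.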
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.562524 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564031 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.563872 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564433 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564631 4952 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564818 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.565024 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.565208 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.565377 4952 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564120 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.564789 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 02:54:16 
crc kubenswrapper[4952]: I1122 02:54:16.565800 4952 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.565967 4952 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566080 4952 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566190 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566308 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566422 4952 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566577 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566735 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566858 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566969 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567077 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567193 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567318 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567433 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567602 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567732 4952 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567855 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.567970 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.566813 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568090 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568322 4952 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568337 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568351 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568363 4952 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568375 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568387 4952 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568400 
4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568411 4952 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568426 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568437 4952 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568449 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568462 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568473 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568485 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568497 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568510 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568522 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568534 4952 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569172 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 
02:54:16.569186 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569198 4952 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569210 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569222 4952 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569233 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569244 4952 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.568126 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569256 4952 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569385 4952 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569404 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569421 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569438 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569452 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569465 4952 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569479 4952 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.569492 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc 
kubenswrapper[4952]: I1122 02:54:16.569750 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570374 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570394 4952 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570408 4952 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570419 4952 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570444 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570457 4952 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570470 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570483 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570494 4952 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570508 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570521 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570533 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc 
kubenswrapper[4952]: I1122 02:54:16.570563 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570753 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570774 4952 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570789 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570802 4952 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570816 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570828 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570841 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570854 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570865 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570877 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570889 4952 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570902 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 
02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570914 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570928 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570940 4952 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570952 4952 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570964 4952 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570976 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.570989 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571002 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571015 4952 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571028 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571041 4952 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571053 4952 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571067 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571083 4952 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571095 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571108 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571119 4952 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571132 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571144 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571157 4952 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571193 4952 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571248 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571262 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571275 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571288 4952 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571301 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571314 4952 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571327 4952 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571338 4952 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571350 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571362 4952 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571374 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571387 4952 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571400 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571412 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571425 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571437 4952 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571450 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571461 4952 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571473 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571481 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571485 4952 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571535 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571660 4952 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571679 4952 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571705 4952 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571728 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571741 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571753 4952 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571766 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571779 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571791 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571804 4952 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571817 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571833 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571846 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571857 4952 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571869 4952 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571880 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571892 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571903 4952 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571915 4952 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571928 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571941 4952 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571955 4952 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571967 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571980 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.571993 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572005 4952 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572017 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572029 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572041 4952 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572053 4952 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572064 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572076 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572088 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572101 4952 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572115 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572128 4952 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572140 4952 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572153 4952 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572193 4952 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572206 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572218 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572231 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572245 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572260 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572273 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.572286 4952 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.573788 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.575575 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.576616 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.577888 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.578463 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.580221 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.583347 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.584112 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.585026 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.586069 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.586495 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.588221 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.593141 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.594054 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.599334 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.599973 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.600000 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.601265 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.603257 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.604121 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.605156 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.606185 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.606305 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.607788 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.608284 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.609504 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.610028 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.610728 4952 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.611314 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.613881 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.614536 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.615119 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.616063 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.618432 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.619347 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.620663 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.621496 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.623171 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.623874 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.625180 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.626282 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.627083 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.628175 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.628944 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.630855 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.631937 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.633071 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.633693 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.634805 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.635479 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.636408 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.637523 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.641639 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.654125 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.664774 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.673119 4952 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.673150 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.673164 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.673383 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.679426 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:54:16 crc kubenswrapper[4952]: W1122 02:54:16.682172 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-232654d454f9696c975983bc09deaec88ad1202d8bac83c4447c5e64831ce78b WatchSource:0}: Error finding container 232654d454f9696c975983bc09deaec88ad1202d8bac83c4447c5e64831ce78b: Status 404 returned error can't find the container with id 232654d454f9696c975983bc09deaec88ad1202d8bac83c4447c5e64831ce78b Nov 22 02:54:16 crc kubenswrapper[4952]: W1122 02:54:16.688373 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-caa44ea15e7e3ef3962b5add8a4fed102894c74a3703ae1db33bf1381e1fc261 WatchSource:0}: Error finding container caa44ea15e7e3ef3962b5add8a4fed102894c74a3703ae1db33bf1381e1fc261: Status 404 returned error can't find the container with id caa44ea15e7e3ef3962b5add8a4fed102894c74a3703ae1db33bf1381e1fc261 Nov 22 02:54:16 crc kubenswrapper[4952]: W1122 02:54:16.694778 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fe50b625cd74d1efe48bd4b6213789c37eb1f5607e651b4a39422011fd79c360 WatchSource:0}: Error finding container fe50b625cd74d1efe48bd4b6213789c37eb1f5607e651b4a39422011fd79c360: Status 404 returned error can't find the container with id fe50b625cd74d1efe48bd4b6213789c37eb1f5607e651b4a39422011fd79c360 Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.696434 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"caa44ea15e7e3ef3962b5add8a4fed102894c74a3703ae1db33bf1381e1fc261"} Nov 22 02:54:16 crc kubenswrapper[4952]: I1122 02:54:16.698306 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"232654d454f9696c975983bc09deaec88ad1202d8bac83c4447c5e64831ce78b"} Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.027681 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.027757 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.027787 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.027809 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.027834 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.027939 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:18.027894217 +0000 UTC m=+22.333911490 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.027952 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.027999 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028025 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028047 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028071 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028094 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:18.028073402 +0000 UTC m=+22.334090675 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028248 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:18.028198795 +0000 UTC m=+22.334216248 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028014 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028359 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028392 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028273 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:18.028264856 +0000 UTC m=+22.334282139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.028490 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:18.028470561 +0000 UTC m=+22.334487834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.530724 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.530762 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.530724 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.530879 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.531030 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.531297 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.642404 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.648351 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.659314 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.660963 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.675219 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-j9kg2"] Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.675567 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.675936 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.676577 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-x6nk8"] Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.676945 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.679454 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680134 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680145 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680451 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680531 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680534 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.680313 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.683400 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.702557 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395"} Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.702623 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7"} Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.702639 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fe50b625cd74d1efe48bd4b6213789c37eb1f5607e651b4a39422011fd79c360"} Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.704386 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.704939 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.706523 4952 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8" exitCode=255 Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.706606 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8"} Nov 22 02:54:17 crc 
kubenswrapper[4952]: I1122 02:54:17.706701 4952 scope.go:117] "RemoveContainer" containerID="75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.707961 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5"} Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.713877 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.728883 4952 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.743456 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.766122 4952 scope.go:117] "RemoveContainer" containerID="ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.766362 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 02:54:17 crc kubenswrapper[4952]: E1122 02:54:17.766418 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.773758 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.800497 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.836979 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-k8s-cni-cncf-io\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837070 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-netns\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837127 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/299b06f8-5ba8-425d-96a5-2866e435b986-hosts-file\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837145 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-os-release\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837185 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cnibin\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837202 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-bin\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837220 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xc9r\" (UniqueName: \"kubernetes.io/projected/ccedfe81-43b3-4af7-88c7-9953b33e7d13-kube-api-access-5xc9r\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837241 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-system-cni-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837260 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-cni-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837278 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cni-binary-copy\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837305 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8ws\" (UniqueName: \"kubernetes.io/projected/299b06f8-5ba8-425d-96a5-2866e435b986-kube-api-access-rf8ws\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837324 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-hostroot\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837347 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-daemon-config\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837469 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-etc-kubernetes\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837593 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-socket-dir-parent\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837635 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-multus\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837655 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-kubelet\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837760 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-conf-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.837790 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-multus-certs\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.851633 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.885781 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.913120 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.930933 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.938911 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/299b06f8-5ba8-425d-96a5-2866e435b986-hosts-file\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.938952 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-os-release\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.938973 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xc9r\" (UniqueName: \"kubernetes.io/projected/ccedfe81-43b3-4af7-88c7-9953b33e7d13-kube-api-access-5xc9r\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939347 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cnibin\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939085 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/299b06f8-5ba8-425d-96a5-2866e435b986-hosts-file\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939371 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-bin\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939434 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-system-cni-dir\") pod \"multus-j9kg2\" (UID: 
\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939417 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-os-release\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939493 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-bin\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939449 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-cni-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939587 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cni-binary-copy\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939634 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8ws\" (UniqueName: \"kubernetes.io/projected/299b06f8-5ba8-425d-96a5-2866e435b986-kube-api-access-rf8ws\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939659 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-hostroot\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939581 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-system-cni-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939680 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-daemon-config\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939700 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-etc-kubernetes\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939725 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-multus-certs\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939724 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-hostroot\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939754 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-socket-dir-parent\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939779 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-multus\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939833 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-kubelet\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939856 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-conf-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939876 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-cni-multus\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939840 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-multus-certs\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939935 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-cni-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939922 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-etc-kubernetes\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 
02:54:17.939948 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-socket-dir-parent\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.939978 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-var-lib-kubelet\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940014 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-k8s-cni-cncf-io\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940017 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-conf-dir\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940049 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-k8s-cni-cncf-io\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940154 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-netns\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940208 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-host-run-netns\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940484 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-multus-daemon-config\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940530 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cni-binary-copy\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.940603 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccedfe81-43b3-4af7-88c7-9953b33e7d13-cnibin\") pod \"multus-j9kg2\" (UID: 
\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.946401 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.955100 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xc9r\" (UniqueName: \"kubernetes.io/projected/ccedfe81-43b3-4af7-88c7-9953b33e7d13-kube-api-access-5xc9r\") pod \"multus-j9kg2\" (UID: \"ccedfe81-43b3-4af7-88c7-9953b33e7d13\") " pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.959654 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 
02:54:17.963056 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8ws\" (UniqueName: \"kubernetes.io/projected/299b06f8-5ba8-425d-96a5-2866e435b986-kube-api-access-rf8ws\") pod \"node-resolver-x6nk8\" (UID: \"299b06f8-5ba8-425d-96a5-2866e435b986\") " pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.976219 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manage
r-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.990449 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j9kg2" Nov 22 02:54:17 crc kubenswrapper[4952]: I1122 02:54:17.995605 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-x6nk8" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.000617 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: W1122 02:54:18.010790 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299b06f8_5ba8_425d_96a5_2866e435b986.slice/crio-b7d636c1dab3807ebf96621cf8922ce8ecfc8544a2026dfb0746371d9ccb8f6c WatchSource:0}: Error finding container b7d636c1dab3807ebf96621cf8922ce8ecfc8544a2026dfb0746371d9ccb8f6c: Status 404 returned error can't find the container with id b7d636c1dab3807ebf96621cf8922ce8ecfc8544a2026dfb0746371d9ccb8f6c Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.026060 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:10Z\\\",\\\"message\\\":\\\"W1122 02:53:59.828002 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
02:53:59.828743 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763780039 cert, and key in /tmp/serving-cert-2201760489/serving-signer.crt, /tmp/serving-cert-2201760489/serving-signer.key\\\\nI1122 02:54:00.060946 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:54:00.063373 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:54:00.063524 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:00.066424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2201760489/tls.crt::/tmp/serving-cert-2201760489/tls.key\\\\\\\"\\\\nF1122 02:54:10.329345 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 
02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.043827 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.043935 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:18 crc 
kubenswrapper[4952]: I1122 02:54:18.043965 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.043988 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.044013 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044041 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:20.044010747 +0000 UTC m=+24.350028020 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044144 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044186 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044202 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044225 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044238 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044210 4952 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:20.044196252 +0000 UTC m=+24.350213525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044286 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044291 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:20.044282965 +0000 UTC m=+24.350300238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044313 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044250 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044403 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:20.044373407 +0000 UTC m=+24.350390690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.044438 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:20.044425168 +0000 UTC m=+24.350442691 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.044651 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.060751 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.067054 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qnw6b"] Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.067709 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vn2dl"] Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.067924 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.068206 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.069787 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ts9bc"] Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.070091 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.070304 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.070528 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.070772 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.072029 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.072064 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.072152 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.072213 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.072599 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.075685 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.075705 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.075813 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.076501 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.077759 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.078397 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.081816 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.103855 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.129566 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144433 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144514 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-os-release\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144571 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144628 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144722 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-system-cni-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144786 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144818 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144855 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144880 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/94f311d8-e9ac-4dd7-bc2c-321490681934-rootfs\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144908 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/94f311d8-e9ac-4dd7-bc2c-321490681934-proxy-tls\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144952 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.144988 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145008 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145035 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145057 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145080 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj94l\" (UniqueName: \"kubernetes.io/projected/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-kube-api-access-xj94l\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145099 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145164 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145189 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145209 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145231 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145253 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145275 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145322 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cnibin\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145344 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145362 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145381 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/94f311d8-e9ac-4dd7-bc2c-321490681934-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145431 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmhx\" (UniqueName: \"kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145457 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/94f311d8-e9ac-4dd7-bc2c-321490681934-kube-api-access-zmt28\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145480 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.145506 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.159796 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.179158 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.195079 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.208220 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:10Z\\\",\\\"message\\\":\\\"W1122 02:53:59.828002 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
02:53:59.828743 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763780039 cert, and key in /tmp/serving-cert-2201760489/serving-signer.crt, /tmp/serving-cert-2201760489/serving-signer.key\\\\nI1122 02:54:00.060946 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:54:00.063373 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:54:00.063524 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:00.066424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2201760489/tls.crt::/tmp/serving-cert-2201760489/tls.key\\\\\\\"\\\\nF1122 02:54:10.329345 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 
02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.233069 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246183 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246227 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmhx\" (UniqueName: \"kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246255 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/94f311d8-e9ac-4dd7-bc2c-321490681934-kube-api-access-zmt28\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246287 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246315 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246332 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246351 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-os-release\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246368 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246408 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246425 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-system-cni-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246462 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246479 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/94f311d8-e9ac-4dd7-bc2c-321490681934-rootfs\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246496 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f311d8-e9ac-4dd7-bc2c-321490681934-proxy-tls\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246513 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246536 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246576 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246597 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246614 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246638 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj94l\" (UniqueName: \"kubernetes.io/projected/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-kube-api-access-xj94l\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246659 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246692 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246711 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246727 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246745 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246763 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246781 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246815 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246834 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246850 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/94f311d8-e9ac-4dd7-bc2c-321490681934-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246868 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cnibin\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246952 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cnibin\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.246995 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.247808 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248113 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248170 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248394 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-os-release\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248429 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/94f311d8-e9ac-4dd7-bc2c-321490681934-rootfs\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248441 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248496 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248598 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248649 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248682 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-system-cni-dir\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248716 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248876 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248912 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248928 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248958 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.248965 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249055 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd\") pod \"ovnkube-node-qnw6b\" (UID: 
\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249535 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249633 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249689 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249722 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.249733 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.250135 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.250199 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/94f311d8-e9ac-4dd7-bc2c-321490681934-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.253523 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f311d8-e9ac-4dd7-bc2c-321490681934-proxy-tls\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.253582 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.262562 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.268135 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmt28\" (UniqueName: \"kubernetes.io/projected/94f311d8-e9ac-4dd7-bc2c-321490681934-kube-api-access-zmt28\") pod \"machine-config-daemon-vn2dl\" (UID: \"94f311d8-e9ac-4dd7-bc2c-321490681934\") " pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.268657 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmhx\" (UniqueName: \"kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx\") pod \"ovnkube-node-qnw6b\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.277253 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj94l\" (UniqueName: \"kubernetes.io/projected/b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4-kube-api-access-xj94l\") pod \"multus-additional-cni-plugins-ts9bc\" (UID: \"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\") " pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.280074 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.294931 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.310686 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.329870 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.341505 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.390744 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.404726 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.415154 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" Nov 22 02:54:18 crc kubenswrapper[4952]: W1122 02:54:18.416102 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef051cd_2285_4b6b_a16f_1154f4d1f5dd.slice/crio-50dc2b64957657973595ac2fe0f873643bc0931db98df6606c180a3e370fef94 WatchSource:0}: Error finding container 50dc2b64957657973595ac2fe0f873643bc0931db98df6606c180a3e370fef94: Status 404 returned error can't find the container with id 50dc2b64957657973595ac2fe0f873643bc0931db98df6606c180a3e370fef94 Nov 22 02:54:18 crc kubenswrapper[4952]: W1122 02:54:18.450807 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e94605_ee67_4d5b_8396_fbe7f8a1a6e4.slice/crio-e9219c0d1f6dccfa0e9ffca3fcf649f0e495704d81f3b58f002ce4b2da4879a2 WatchSource:0}: Error finding container e9219c0d1f6dccfa0e9ffca3fcf649f0e495704d81f3b58f002ce4b2da4879a2: Status 404 returned error can't find the container with id e9219c0d1f6dccfa0e9ffca3fcf649f0e495704d81f3b58f002ce4b2da4879a2 Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.713853 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x6nk8" event={"ID":"299b06f8-5ba8-425d-96a5-2866e435b986","Type":"ContainerStarted","Data":"6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.713920 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x6nk8" event={"ID":"299b06f8-5ba8-425d-96a5-2866e435b986","Type":"ContainerStarted","Data":"b7d636c1dab3807ebf96621cf8922ce8ecfc8544a2026dfb0746371d9ccb8f6c"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.715329 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerStarted","Data":"6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.715389 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerStarted","Data":"34e19879b662201c59d2eb45059411e416ea3a5c3220bbac992dfb180cda352e"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.717330 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.719958 4952 scope.go:117] "RemoveContainer" containerID="ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8" Nov 22 02:54:18 crc kubenswrapper[4952]: E1122 02:54:18.720164 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.728394 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" 
event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.728459 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"2537f5dc9060fee78fb7e9ad0a02ad6b8c4cfb44c46349385e81fcc1debf411c"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.730823 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerStarted","Data":"e9219c0d1f6dccfa0e9ffca3fcf649f0e495704d81f3b58f002ce4b2da4879a2"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.734742 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903" exitCode=0 Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.735230 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.735365 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"50dc2b64957657973595ac2fe0f873643bc0931db98df6606c180a3e370fef94"} Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.735608 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.758864 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.784325 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.809366 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.827876 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.842348 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.876765 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.896440 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.910690 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.924449 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:18 crc kubenswrapper[4952]: I1122 02:54:18.991970 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75161cdda894dd860acef8dd3ac05eff62b25748914a8ae55d9a46ed72c42c71\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:10Z\\\",\\\"message\\\":\\\"W1122 02:53:59.828002 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
02:53:59.828743 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763780039 cert, and key in /tmp/serving-cert-2201760489/serving-signer.crt, /tmp/serving-cert-2201760489/serving-signer.key\\\\nI1122 02:54:00.060946 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:54:00.063373 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:54:00.063524 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:00.066424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2201760489/tls.crt::/tmp/serving-cert-2201760489/tls.key\\\\\\\"\\\\nF1122 02:54:10.329345 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 
02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.023421 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.061401 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.096418 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.146862 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.193367 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.225755 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.246199 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.280441 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.294945 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.308193 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.324710 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.340705 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.354505 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.378280 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.398497 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.420500 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.437163 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc 
kubenswrapper[4952]: I1122 02:54:19.530978 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.531044 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.531085 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:19 crc kubenswrapper[4952]: E1122 02:54:19.531147 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:19 crc kubenswrapper[4952]: E1122 02:54:19.531293 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:19 crc kubenswrapper[4952]: E1122 02:54:19.531529 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.739892 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e" exitCode=0 Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.739992 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.744972 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.745011 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.745023 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.745036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.745054 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.745066 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.747822 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.750742 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779"} Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.770834 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.793031 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.810130 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.829026 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc 
kubenswrapper[4952]: I1122 02:54:19.844753 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.862591 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.883653 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.901240 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.923074 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.935991 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.948241 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.960882 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.971471 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.981150 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:19 crc kubenswrapper[4952]: I1122 02:54:19.993126 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.005372 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.025041 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.038930 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.055177 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.068609 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.082700 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.099982 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.101383 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.101669 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:24.101630191 +0000 UTC m=+28.407647474 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.101746 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.101863 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.101909 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 
02:54:20.101937 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.101936 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102028 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102075 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102099 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102117 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102145 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102160 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102038 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:24.10200235 +0000 UTC m=+28.408019613 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102172 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102191 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:24.102175974 +0000 UTC m=+28.408193467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102207 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:24.102199205 +0000 UTC m=+28.408216708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:20 crc kubenswrapper[4952]: E1122 02:54:20.102220 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:24.102213215 +0000 UTC m=+28.408230728 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.113750 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.124204 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: 
I1122 02:54:20.145652 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.176968 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.190482 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.201296 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.756843 4952 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerStarted","Data":"150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579"} Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.782299 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",
\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.805310 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.826162 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.839367 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7wlpk"] Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.839827 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.841509 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.842425 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.842620 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.842656 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.857155 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.870367 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.881837 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.893934 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.905808 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.912031 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9505980-28b9-46e1-85b2-ade5d1684ee7-host\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.912119 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9vv\" (UniqueName: \"kubernetes.io/projected/f9505980-28b9-46e1-85b2-ade5d1684ee7-kube-api-access-9r9vv\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.912191 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f9505980-28b9-46e1-85b2-ade5d1684ee7-serviceca\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.924472 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.936768 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.950056 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.961023 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.973626 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:20 crc kubenswrapper[4952]: I1122 02:54:20.985214 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.007513 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.012887 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f9505980-28b9-46e1-85b2-ade5d1684ee7-serviceca\") pod \"node-ca-7wlpk\" (UID: 
\"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.012937 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9505980-28b9-46e1-85b2-ade5d1684ee7-host\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.012969 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9vv\" (UniqueName: \"kubernetes.io/projected/f9505980-28b9-46e1-85b2-ade5d1684ee7-kube-api-access-9r9vv\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.013119 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9505980-28b9-46e1-85b2-ade5d1684ee7-host\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.013786 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f9505980-28b9-46e1-85b2-ade5d1684ee7-serviceca\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.019578 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.030265 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.032623 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9vv\" (UniqueName: \"kubernetes.io/projected/f9505980-28b9-46e1-85b2-ade5d1684ee7-kube-api-access-9r9vv\") pod \"node-ca-7wlpk\" (UID: \"f9505980-28b9-46e1-85b2-ade5d1684ee7\") " pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.041653 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.056019 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.068032 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.107061 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.144376 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.154671 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7wlpk" Nov 22 02:54:21 crc kubenswrapper[4952]: W1122 02:54:21.166853 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9505980_28b9_46e1_85b2_ade5d1684ee7.slice/crio-59df414359e0fc772ac87239dc3381db6db7700aca74c64e8ed42bc071426f23 WatchSource:0}: Error finding container 59df414359e0fc772ac87239dc3381db6db7700aca74c64e8ed42bc071426f23: Status 404 returned error can't find the container with id 59df414359e0fc772ac87239dc3381db6db7700aca74c64e8ed42bc071426f23 Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.186809 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.207375 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.220532 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.234645 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.258328 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.274685 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.287877 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.530816 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.530878 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.530926 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:21 crc kubenswrapper[4952]: E1122 02:54:21.530974 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:21 crc kubenswrapper[4952]: E1122 02:54:21.531104 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:21 crc kubenswrapper[4952]: E1122 02:54:21.531360 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.769357 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de"} Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.773305 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7wlpk" event={"ID":"f9505980-28b9-46e1-85b2-ade5d1684ee7","Type":"ContainerStarted","Data":"231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24"} Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.773345 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7wlpk" event={"ID":"f9505980-28b9-46e1-85b2-ade5d1684ee7","Type":"ContainerStarted","Data":"59df414359e0fc772ac87239dc3381db6db7700aca74c64e8ed42bc071426f23"} Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.776490 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579" exitCode=0 Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.776585 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579"} Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.800466 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.824024 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.844889 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.861608 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.874041 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.890529 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.908460 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.922533 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.934428 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.954240 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.969734 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:21 crc kubenswrapper[4952]: I1122 02:54:21.988942 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.002013 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.017613 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.029356 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.043521 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.057294 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.071425 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.105604 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.119771 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.132405 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.147003 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.162385 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.187893 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.205518 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.234676 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.250128 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.270041 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.287800 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.306067 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.381452 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.382308 4952 scope.go:117] "RemoveContainer" containerID="ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8" Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.382664 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.745008 4952 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.747354 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.747410 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.747427 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.747588 4952 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.754912 4952 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.755645 4952 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.758497 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.758574 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.758597 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.758625 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.758640 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.775772 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.781139 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.781223 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.781236 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.781259 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.781272 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.786049 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e" exitCode=0 Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.786141 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e"} Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.800715 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.802415 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.806613 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.806654 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.806667 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.806688 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.806703 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.826564 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.830528 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.834892 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.834946 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.834963 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.834989 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.835014 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.852629 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.854380 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.858036 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.858076 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.858089 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.858109 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.858125 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.869971 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: E1122 02:54:22.870132 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.871366 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.874948 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.875006 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.875019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.875037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.875049 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.887778 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.901960 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.918916 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.935935 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.965616 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.985002 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.987903 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.987976 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.988001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.988037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4952]: I1122 02:54:22.988062 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.007683 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.023780 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.037509 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.052684 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.067702 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.091289 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.091327 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.091339 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.091357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.091370 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.195770 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.195828 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.195841 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.195864 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.195880 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.301489 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.301618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.301637 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.301661 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.301678 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.405188 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.405246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.405262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.405288 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.405309 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.507868 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.507907 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.507916 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.507930 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.507941 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.531129 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.531194 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.531354 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:23 crc kubenswrapper[4952]: E1122 02:54:23.531514 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:23 crc kubenswrapper[4952]: E1122 02:54:23.531787 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:23 crc kubenswrapper[4952]: E1122 02:54:23.531917 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.611741 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.611794 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.611807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.611827 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.611842 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.715296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.715407 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.715427 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.715451 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.715469 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.794701 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b" exitCode=0 Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.794775 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.818460 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.818506 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.818519 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.818553 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.818577 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.865142 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.883141 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.899320 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.912842 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.921218 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.921261 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.921276 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.921295 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.921307 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.927982 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.948415 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.966912 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:23 crc kubenswrapper[4952]: I1122 02:54:23.984913 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.002291 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.025105 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.025165 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.025186 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.025219 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.025238 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.028180 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f3
1f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.044225 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.063510 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.078915 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.092611 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.107092 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.128155 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.128228 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.128252 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.128283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.128307 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.145769 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.145900 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.145934 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.145954 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.145972 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146067 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146110 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:32.146097567 +0000 UTC m=+36.452114840 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146163 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 02:54:32.146156458 +0000 UTC m=+36.452173731 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146222 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146241 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:32.14623544 +0000 UTC m=+36.452252713 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146291 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146300 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146310 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146328 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:32.146322882 +0000 UTC m=+36.452340155 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146366 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146374 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146380 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:24 crc kubenswrapper[4952]: E1122 02:54:24.146398 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:32.146392854 +0000 UTC m=+36.452410127 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.231472 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.231581 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.231607 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.231645 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.231673 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.334145 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.334606 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.334624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.334644 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.334657 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.436823 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.436869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.436883 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.436906 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.436917 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.538834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.538879 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.538889 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.538903 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.538913 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.642537 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.642607 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.642621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.642640 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.642653 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.747044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.747085 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.747094 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.747108 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.747118 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.802826 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerStarted","Data":"0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.810205 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.810820 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.840027 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.845275 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.849788 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.849860 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.849874 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.849891 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.849905 4952 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.859725 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.882459 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.900683 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.922646 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.947195 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.953104 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.953158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.953174 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.953199 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.953219 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.967508 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:24 crc kubenswrapper[4952]: I1122 02:54:24.986187 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
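Every status-patch failure in this window shares one root cause: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose notAfter, 2025-08-24T17:21:41Z, is almost three months behind the node clock (2025-11-22), so each TLS handshake fails before the patch is ever evaluated. A minimal sketch of the validity-window comparison the error text describes, using the two timestamps from the log (the function name is illustrative, not kubelet code):

    from datetime import datetime, timezone

    # Mirrors the error text above: a certificate is rejected when
    # "current time ... is after" its notAfter timestamp.
    def within_validity_window(not_after: str, now: datetime) -> bool:
        expiry = datetime.fromisoformat(not_after.replace("Z", "+00:00"))
        return now <= expiry

    now = datetime(2025, 11, 22, 2, 54, 24, tzinfo=timezone.utc)
    print(within_validity_window("2025-08-24T17:21:41Z", now))  # False -> x509 error

Separately, kube-apiserver-check-endpoints dies with exit code 255 (its fatal log line reads pods "kube-apiserver-crc" not found) and sits in CrashLoopBackOff at "back-off 10s" with restartCount 1. Assuming the commonly documented kubelet policy of a 10s initial delay that doubles per restart up to a five-minute cap, the retry delay grows as sketched:

    # Assumed crash-loop backoff policy: 10s initial delay (matching the
    # "back-off 10s" message above), doubling per restart, capped at 300s.
    def backoff_seconds(restart: int, base: int = 10, cap: int = 300) -> int:
        return min(base * 2 ** restart, cap)

    print([backoff_seconds(n) for n in range(7)])  # [10, 20, 40, 80, 160, 300, 300]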
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.015090 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.043333 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.057058 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.057120 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.057140 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.057166 4952 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.057185 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.070969 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.089614 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.110909 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.127386 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.144971 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.160275 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.160337 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.160354 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.160394 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.160447 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.171786 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.193998 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.218077 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.234735 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.251426 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.263426 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.263484 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.263503 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.263527 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.263586 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.289389 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.310304 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.338949 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.356749 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.366128 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.366182 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.366205 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.366237 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.366260 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.376660 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.399402 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.424842 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.447587 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.469198 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.469271 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.469295 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.469349 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.469373 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.471391 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.501809 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.531000 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.531031 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.531079 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:25 crc kubenswrapper[4952]: E1122 02:54:25.531196 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:25 crc kubenswrapper[4952]: E1122 02:54:25.531325 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:25 crc kubenswrapper[4952]: E1122 02:54:25.531457 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.573007 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.573063 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.573082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.573108 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.573128 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.676744 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.676821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.676842 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.676871 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.676896 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.781096 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.781163 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.781182 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.781208 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.781228 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.819465 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623" exitCode=0 Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.819568 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.819773 4952 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.821120 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.846786 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.863439 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.866121 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.884648 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.884706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.884727 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.884752 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.884773 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.895390 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.917110 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.940727 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.956371 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.969067 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.982031 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.986934 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.986965 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.986976 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.986993 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.987009 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4952]: I1122 02:54:25.994829 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.009281 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.023026 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.043972 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.058239 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.070082 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.086919 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.090188 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.090253 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.090265 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.090286 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.090299 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.102402 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.118080 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.133439 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.148681 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.164553 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.192703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.192753 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.192771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.192796 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.192811 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.199620 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.218087 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.245090 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.266400 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.280870 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.296483 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.296554 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.296569 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.296597 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.296618 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.311813 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.326622 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.387106 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.402887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.402948 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.402961 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.402987 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.403001 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.423914 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.445882 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.505928 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.505971 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.505981 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.506005 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.506015 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.545581 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.565157 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.579901 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus
\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.592799 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.605194 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.609611 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.609884 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.609969 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.610050 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.610133 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.628876 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.646919 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.664128 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.682403 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.704635 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.713575 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.713620 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.713635 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.713657 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.713675 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.719128 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.735123 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.748245 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.767105 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b71
7aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.785325 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.816813 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.817202 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.817390 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.817573 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.817720 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.828045 4952 generic.go:334] "Generic (PLEG): container finished" podID="b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4" containerID="a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae" exitCode=0 Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.828258 4952 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.828423 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerDied","Data":"a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.853812 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/l
og/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.888089 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.913720 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.921162 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.921214 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.921228 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.921249 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.921263 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.931598 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.951757 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.966985 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:26 crc kubenswrapper[4952]: I1122 02:54:26.986084 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.005012 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.019931 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.025290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.025348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.025363 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.025523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.025647 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.045213 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.061614 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.075881 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.089280 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.100911 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.112988 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.129019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.129053 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.129064 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.129084 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.129099 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.232898 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.232964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.232982 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.233008 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.233027 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.336394 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.336469 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.336493 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.336524 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.336582 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.439743 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.439797 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.439814 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.439845 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.439861 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.530756 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.530797 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.530954 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:27 crc kubenswrapper[4952]: E1122 02:54:27.531125 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:27 crc kubenswrapper[4952]: E1122 02:54:27.531920 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:27 crc kubenswrapper[4952]: E1122 02:54:27.532012 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.543502 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.543604 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.543626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.543651 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.543671 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.647129 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.647192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.647210 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.647233 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.647252 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.750343 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.750418 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.750437 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.750463 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.750484 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.838659 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" event={"ID":"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4","Type":"ContainerStarted","Data":"7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.841583 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/0.log" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.851491 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb" exitCode=1 Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.851591 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.852675 4952 scope.go:117] "RemoveContainer" containerID="97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.852896 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.852959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.852986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.853015 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.853036 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.860747 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.900006 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.922040 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.942098 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.959959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.960000 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.960012 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.960029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.960042 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:27Z","lastTransitionTime":"2025-11-22T02:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.962093 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:27 crc kubenswrapper[4952]: I1122 02:54:27.980231 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.000886 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:27Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.032447 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.054202 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.062835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.062885 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc 
kubenswrapper[4952]: I1122 02:54:28.062901 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.062923 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.062939 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.073727 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.088265 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.099740 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.111290 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.126565 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.142592 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.155583 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.167369 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.167411 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.167425 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.167448 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.167465 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.173098 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.194720 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 02:54:27.043697 6229 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:27.043706 6229 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:54:27.043725 6229 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:54:27.043775 6229 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:54:27.043780 6229 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:27.043797 6229 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:27.043801 6229 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 02:54:27.043813 6229 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:27.043822 6229 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:54:27.043863 6229 factory.go:656] Stopping watch factory\\\\nI1122 02:54:27.043885 6229 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:27.043931 6229 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:27.044199 6229 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16c
c0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.211830 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c
3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.228320 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.244043 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.253489 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.278098 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.282990 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.283023 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.283034 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.283051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.283066 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.323280 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.348044 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.366447 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.385929 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.385972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.385981 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.385999 4952 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.386009 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.389360 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.412587 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.469356 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.484045 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.489380 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.489442 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.489457 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.489479 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.489493 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.592870 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.592938 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.592953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.592974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.592987 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.696536 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.696599 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.696613 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.696633 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.696647 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.799706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.799751 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.799767 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.799786 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.799799 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.857651 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/0.log" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.861329 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.861392 4952 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.880021 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.896903 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.903450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.903517 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.903575 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.903609 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.903634 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:28Z","lastTransitionTime":"2025-11-22T02:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.922608 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.943798 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.973231 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b37
5764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 02:54:27.043697 6229 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:27.043706 6229 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:54:27.043725 6229 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:54:27.043775 6229 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:54:27.043780 6229 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:27.043797 6229 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:27.043801 6229 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 02:54:27.043813 6229 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:27.043822 6229 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:54:27.043863 6229 factory.go:656] Stopping watch factory\\\\nI1122 02:54:27.043885 6229 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:27.043931 6229 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:27.044199 6229 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:28 crc kubenswrapper[4952]: I1122 02:54:28.992678 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:28Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.007215 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.007439 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc 
kubenswrapper[4952]: I1122 02:54:29.007524 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.007620 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.007723 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.010816 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.027793 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.044756 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.061523 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.075224 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.102223 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.110719 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.110868 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.110964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.111035 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.111093 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.125234 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.149896 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.170481 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.216385 4952 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.216472 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.216494 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.216525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.216597 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.320271 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.320346 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.320368 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.320397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.320417 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.423449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.423499 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.423521 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.423579 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.423605 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.525614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.525720 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.525735 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.525754 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.525780 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.530185 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.530231 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.530290 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:29 crc kubenswrapper[4952]: E1122 02:54:29.530299 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:29 crc kubenswrapper[4952]: E1122 02:54:29.530358 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:29 crc kubenswrapper[4952]: E1122 02:54:29.530416 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.629713 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.629780 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.629800 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.629824 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.629842 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.737246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.737328 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.737351 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.737381 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.737405 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.840405 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.840514 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.840533 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.840614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.840686 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.866932 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/1.log" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.867807 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/0.log" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.871396 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579" exitCode=1 Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.871455 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.871529 4952 scope.go:117] "RemoveContainer" containerID="97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.872754 4952 scope.go:117] "RemoveContainer" containerID="f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579" Nov 22 02:54:29 crc kubenswrapper[4952]: E1122 02:54:29.873039 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.899991 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.921468 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.942090 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.945484 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.945569 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.945591 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.945618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.945636 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:29Z","lastTransitionTime":"2025-11-22T02:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.959360 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:29 crc kubenswrapper[4952]: I1122 02:54:29.977501 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:29Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.011009 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.029873 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.049569 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.049781 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.049921 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.050058 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.050175 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.050851 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.070221 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba9
6e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.092126 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.113807 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.138614 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.153124 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.153176 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.153193 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.153220 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.153237 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.161235 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.197260 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97bf92abda749465e7cd710a2a793549f0d51b717aa4afea2961a71eed64b6eb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"message\\\":\\\"flector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 02:54:27.043697 6229 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:27.043706 6229 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:54:27.043725 6229 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:54:27.043775 6229 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:54:27.043780 6229 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:27.043797 6229 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:27.043801 6229 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 02:54:27.043813 6229 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:27.043822 6229 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:54:27.043863 6229 factory.go:656] Stopping watch factory\\\\nI1122 02:54:27.043885 6229 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:27.043931 6229 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:27.044199 6229 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.223885 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.256408 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.256782 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc 
kubenswrapper[4952]: I1122 02:54:30.256921 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.257104 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.257237 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.360431 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.360495 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.360518 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.360588 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.360615 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.464740 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.464799 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.464813 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.464833 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.464849 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.567914 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.567952 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.567964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.567978 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.567989 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.671041 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.671426 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.671633 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.671808 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.671970 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.775135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.775195 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.775217 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.775239 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.775253 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877591 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877621 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/1.log" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877656 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877675 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877701 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.877719 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.882156 4952 scope.go:117] "RemoveContainer" containerID="f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579" Nov 22 02:54:30 crc kubenswrapper[4952]: E1122 02:54:30.882437 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.902753 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.920227 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.931220 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.945061 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.965129 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.980443 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.980516 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.980533 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.980593 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.980611 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:30Z","lastTransitionTime":"2025-11-22T02:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.981598 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:30 crc kubenswrapper[4952]: I1122 02:54:30.999832 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:30Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.017821 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.048518 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.064311 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.077134 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.083155 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.083219 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.083238 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.083266 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.083289 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.099085 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.116621 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.146363 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b37
5764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.169666 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.186619 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.186662 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.186674 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.186691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.186704 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.290214 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.290250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.290260 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.290277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.290287 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.393575 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.393661 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.393681 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.393710 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.393731 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.497160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.497247 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.497275 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.497304 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.497325 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.531070 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.531152 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.531201 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:31 crc kubenswrapper[4952]: E1122 02:54:31.531304 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:31 crc kubenswrapper[4952]: E1122 02:54:31.531475 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:31 crc kubenswrapper[4952]: E1122 02:54:31.531737 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.599746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.599834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.599861 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.599891 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.599917 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.607255 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n"] Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.607991 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.610779 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.611474 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.631688 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.640955 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.641051 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.641112 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh6x6\" (UniqueName: \"kubernetes.io/projected/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-kube-api-access-xh6x6\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.641405 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.663380 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.679787 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.694923 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.703827 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.703861 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.703873 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.703892 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.703905 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.717216 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.739615 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.743261 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.743365 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.743405 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh6x6\" (UniqueName: \"kubernetes.io/projected/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-kube-api-access-xh6x6\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.743465 4952 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.744187 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.744823 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.762013 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:5
4:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.762310 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.776057 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh6x6\" (UniqueName: \"kubernetes.io/projected/5f1869ba-6fff-4b0d-9e45-1e2aac293caa-kube-api-access-xh6x6\") pod \"ovnkube-control-plane-749d76644c-jkv7n\" (UID: \"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.778422 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.788793 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.807334 4952 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.807414 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.807443 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.807478 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.807500 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.810884 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.827211 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.846978 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.873414 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.890536 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.910140 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.910207 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.910232 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.910264 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.910282 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:31Z","lastTransitionTime":"2025-11-22T02:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.916392 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.929873 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" Nov 22 02:54:31 crc kubenswrapper[4952]: I1122 02:54:31.941841 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:31Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:31 crc kubenswrapper[4952]: W1122 02:54:31.957601 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f1869ba_6fff_4b0d_9e45_1e2aac293caa.slice/crio-499a404fe9de31d25b4475910dc4c128afbd090f9bd761a3a33f473d894f8388 WatchSource:0}: Error finding container 
499a404fe9de31d25b4475910dc4c128afbd090f9bd761a3a33f473d894f8388: Status 404 returned error can't find the container with id 499a404fe9de31d25b4475910dc4c128afbd090f9bd761a3a33f473d894f8388 Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.014323 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.014388 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.014407 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.014432 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.014450 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.117862 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.117920 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.117933 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.117958 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.117971 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.148778 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.148984 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149024 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.148971859 +0000 UTC m=+52.454989172 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.149057 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.149100 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.149139 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149224 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149240 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149309 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149329 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149338 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.149311219 +0000 UTC m=+52.455328532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149370 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.14935714 +0000 UTC m=+52.455374453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149368 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149416 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149441 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149522 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.149492163 +0000 UTC m=+52.455509466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149344 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.149637 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.149624366 +0000 UTC m=+52.455641649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.223976 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.224050 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.224066 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.224088 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.224105 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.327256 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.327327 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.327351 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.327384 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.327406 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.430342 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.430711 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.430888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.431030 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.431192 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.533070 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.533113 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.533123 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.533137 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.533148 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.636440 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.636504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.636520 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.636573 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.636589 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.739184 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.739431 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.739639 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.739801 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.739961 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.781835 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gkngm"]
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.783026 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm"
Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.783281 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.806340 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.835348 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.844374 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.844463 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.844488 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.844523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.844587 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.858177 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.858230 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt95v\" (UniqueName: \"kubernetes.io/projected/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-kube-api-access-zt95v\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.862816 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.886866 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.893027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" event={"ID":"5f1869ba-6fff-4b0d-9e45-1e2aac293caa","Type":"ContainerStarted","Data":"f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.896764 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" event={"ID":"5f1869ba-6fff-4b0d-9e45-1e2aac293caa","Type":"ContainerStarted","Data":"0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.896928 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" event={"ID":"5f1869ba-6fff-4b0d-9e45-1e2aac293caa","Type":"ContainerStarted","Data":"499a404fe9de31d25b4475910dc4c128afbd090f9bd761a3a33f473d894f8388"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.908417 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.932935 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.947601 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.947701 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.947725 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.947753 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.947771 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:32Z","lastTransitionTime":"2025-11-22T02:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.948468 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.959446 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt95v\" (UniqueName: 
\"kubernetes.io/projected/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-kube-api-access-zt95v\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.959663 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.959854 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:32 crc kubenswrapper[4952]: E1122 02:54:32.959959 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:54:33.459933858 +0000 UTC m=+37.765951171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.964023 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.989740 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:32Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:32 crc kubenswrapper[4952]: I1122 02:54:32.995117 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt95v\" (UniqueName: \"kubernetes.io/projected/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-kube-api-access-zt95v\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.006074 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.020147 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.036285 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.046748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.046957 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.047099 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.047352 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.047609 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.062845 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.068374 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.074445 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: 
I1122 02:54:33.074771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.075010 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.075193 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.075350 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.095135 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.095303 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.101993 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.102046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.102058 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.102082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.102097 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.116056 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.122316 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.128612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.128676 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.128698 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.128728 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.128741 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.136727 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.143253 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.148493 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.148567 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.148585 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.148611 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.148629 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.152201 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.164619 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2
d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.164734 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.166985 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.167037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.167051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.167069 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.167078 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.167909 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.180778 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.199604 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b37
5764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.222826 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.236591 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\
\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.246556 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.257619 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.269785 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.269828 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.269838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.269855 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.269866 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.274221 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.290807 4952 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344
cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.304800 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.322304 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.339806 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.355596 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.372252 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.372278 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.372290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.372306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.372318 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.389008 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.409184 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.424658 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.443318 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:33Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.466170 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.466435 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.466614 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:54:34.46652448 +0000 UTC m=+38.772541793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.474671 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.474716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.474731 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.474752 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.474765 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.530198 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.530238 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.530213 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.530373 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.530488 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:33 crc kubenswrapper[4952]: E1122 02:54:33.530632 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.577366 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.577440 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.577466 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.577494 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.577513 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.682862 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.682938 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.682961 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.682988 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.683012 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.786900 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.786962 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.786983 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.787012 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.787034 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.889709 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.889755 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.889769 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.889791 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.889806 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.992724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.992773 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.992791 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.992812 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:33 crc kubenswrapper[4952]: I1122 02:54:33.992827 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:33Z","lastTransitionTime":"2025-11-22T02:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.095665 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.095729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.095747 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.095772 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.095790 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.198633 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.198732 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.198758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.198791 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.198822 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.301937 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.301997 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.302017 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.302049 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.302068 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.405219 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.405273 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.405294 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.405317 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.405335 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.478479 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:34 crc kubenswrapper[4952]: E1122 02:54:34.478737 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:34 crc kubenswrapper[4952]: E1122 02:54:34.478823 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:54:36.478800303 +0000 UTC m=+40.784817606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.508030 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.508082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.508100 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.508123 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.508140 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.531035 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.531040 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:34 crc kubenswrapper[4952]: E1122 02:54:34.531288 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:34 crc kubenswrapper[4952]: E1122 02:54:34.531385 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.610643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.610724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.610740 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.610761 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.610773 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.714187 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.714240 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.714257 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.714278 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.714291 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.817830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.817903 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.817920 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.817953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.818001 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.921652 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.921730 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.921756 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.921790 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:34 crc kubenswrapper[4952]: I1122 02:54:34.921815 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:34Z","lastTransitionTime":"2025-11-22T02:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.025524 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.025626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.025640 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.025662 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.025677 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.128463 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.128923 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.129053 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.129203 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.129335 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.232751 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.232823 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.232846 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.232878 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.232898 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.335884 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.336251 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.336443 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.336707 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.336845 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.440872 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.440978 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.440996 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.441038 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.441057 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.530652 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:35 crc kubenswrapper[4952]: E1122 02:54:35.530884 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.531208 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:35 crc kubenswrapper[4952]: E1122 02:54:35.531526 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.545321 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.545387 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.545408 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.545438 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.545461 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.649346 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.649400 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.649418 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.649482 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.649501 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.753142 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.753255 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.753308 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.753336 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.753354 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.856867 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.856939 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.856974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.857007 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.857031 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.960430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.960494 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.960515 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.960590 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:35 crc kubenswrapper[4952]: I1122 02:54:35.960620 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:35Z","lastTransitionTime":"2025-11-22T02:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.064574 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.064643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.064661 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.064685 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.064703 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.168229 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.168282 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.168296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.168319 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.168330 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.272195 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.272266 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.272292 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.272326 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.272356 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.375378 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.375436 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.375448 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.375475 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.375488 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.479166 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.479262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.479281 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.479305 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.479322 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.505040 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:36 crc kubenswrapper[4952]: E1122 02:54:36.505244 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:36 crc kubenswrapper[4952]: E1122 02:54:36.505356 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:54:40.505322781 +0000 UTC m=+44.811340084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.530614 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:36 crc kubenswrapper[4952]: E1122 02:54:36.530923 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.530937 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:36 crc kubenswrapper[4952]: E1122 02:54:36.531710 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.531978 4952 scope.go:117] "RemoveContainer" containerID="ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.553223 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.575778 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.586388 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.586468 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.586486 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.586513 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.586531 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.596939 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.621022 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.660563 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b37
5764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.686002 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.689277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.689321 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.689334 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.689356 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.689371 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.705018 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.726332 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.745306 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.762805 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.776766 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.793567 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.793624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.793639 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.793661 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.793674 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.794293 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.817701 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.839111 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.862641 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.875515 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.888845 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:
31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.896563 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.896607 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.896621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.896641 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.896653 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.911898 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.913899 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd"} Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.914663 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.939410 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.956934 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.985612 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b37
5764aae0cd2d80d8cfa83579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.999157 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.999349 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.999434 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.999523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:36 crc kubenswrapper[4952]: I1122 02:54:36.999703 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:36Z","lastTransitionTime":"2025-11-22T02:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.010486 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.029350 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f5
3b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.044867 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.060269 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.074936 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.088156 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.101414 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.107953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.108040 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.108054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.108082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.108095 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.140074 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.160385 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.177386 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.193981 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.209733 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.211616 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.211691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.211705 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.211730 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.211747 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.229407 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.244318 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.314101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.314155 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.314167 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.314184 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.314196 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.416468 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.416504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.416513 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.416526 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.416536 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.519269 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.519352 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.519371 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.519398 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.519438 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.530574 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.530892 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:37 crc kubenswrapper[4952]: E1122 02:54:37.531067 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:37 crc kubenswrapper[4952]: E1122 02:54:37.531394 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.622116 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.622172 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.622189 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.622214 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.622233 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.730183 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.730254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.730275 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.730306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.730328 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.833930 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.834335 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.834532 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.834788 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.835009 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.938274 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.938371 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.938393 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.938417 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:37 crc kubenswrapper[4952]: I1122 02:54:37.938435 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:37Z","lastTransitionTime":"2025-11-22T02:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.040922 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.040967 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.040977 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.040994 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.041005 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.144300 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.144363 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.144380 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.144411 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.144428 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.247930 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.248357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.248459 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.248603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.248743 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.352162 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.352232 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.352252 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.352281 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.352300 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.455569 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.456003 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.456025 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.456046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.456060 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.530280 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.530355 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:38 crc kubenswrapper[4952]: E1122 02:54:38.530827 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:38 crc kubenswrapper[4952]: E1122 02:54:38.531191 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.559435 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.559507 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.559528 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.559582 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.559604 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.662610 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.662688 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.662712 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.662739 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.662757 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.765789 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.765872 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.765892 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.765920 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.765935 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.869481 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.869585 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.869606 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.869631 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.869650 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.973918 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.974001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.974018 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.974047 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:38 crc kubenswrapper[4952]: I1122 02:54:38.974070 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:38Z","lastTransitionTime":"2025-11-22T02:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.077751 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.077831 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.077844 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.077868 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.077881 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.186444 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.186823 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.186870 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.186911 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.186939 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.291242 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.291313 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.291332 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.291358 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.291378 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.394632 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.394703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.394722 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.394748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.394766 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.498052 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.498106 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.498117 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.498136 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.498149 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.530422 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.530422 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:39 crc kubenswrapper[4952]: E1122 02:54:39.530666 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:39 crc kubenswrapper[4952]: E1122 02:54:39.530733 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.602289 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.602405 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.602425 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.602491 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.602513 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.706496 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.706674 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.706740 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.706777 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.706838 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.810721 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.810843 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.810866 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.810928 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.810947 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.914254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.914344 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.914363 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.914394 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:39 crc kubenswrapper[4952]: I1122 02:54:39.914420 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:39Z","lastTransitionTime":"2025-11-22T02:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.018067 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.018133 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.018157 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.018191 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.018213 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.122026 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.122090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.122101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.122121 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.122133 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.226035 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.226100 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.226119 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.226146 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.226168 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.329858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.329942 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.329967 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.330006 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.330030 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.433679 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.433760 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.433778 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.433804 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.433829 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.530125 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.530126 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:40 crc kubenswrapper[4952]: E1122 02:54:40.530386 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:40 crc kubenswrapper[4952]: E1122 02:54:40.530591 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.536760 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.536830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.536857 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.536887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.536914 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.552028 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:40 crc kubenswrapper[4952]: E1122 02:54:40.552246 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:40 crc kubenswrapper[4952]: E1122 02:54:40.552362 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:54:48.55233279 +0000 UTC m=+52.858350093 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.639885 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.639937 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.639958 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.639980 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.639998 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.743533 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.743620 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.743637 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.743660 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.743677 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.846943 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.847046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.847069 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.847624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.847875 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.950984 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.951032 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.951048 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.951074 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:40 crc kubenswrapper[4952]: I1122 02:54:40.951091 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:40Z","lastTransitionTime":"2025-11-22T02:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.054521 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.054615 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.054634 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.054657 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.054674 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.158123 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.158203 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.158221 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.158248 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.158270 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.261322 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.261379 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.261397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.261423 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.261440 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.364538 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.364713 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.364736 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.364775 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.364799 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.472535 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.472648 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.472672 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.472706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.472730 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.530934 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.531072 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:41 crc kubenswrapper[4952]: E1122 02:54:41.531169 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:41 crc kubenswrapper[4952]: E1122 02:54:41.531341 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.581944 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.582036 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.582072 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.582103 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.582125 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.686738 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.686816 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.686834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.686862 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.686882 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.791002 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.791069 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.791093 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.791122 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.791143 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.894192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.894353 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.894376 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.894436 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.894462 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.997680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.997762 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.997785 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.997815 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:41 crc kubenswrapper[4952]: I1122 02:54:41.997833 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:41Z","lastTransitionTime":"2025-11-22T02:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.102067 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.102135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.102171 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.102203 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.102266 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.205793 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.205871 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.205908 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.206036 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.206064 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.310002 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.310110 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.310128 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.310158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.310178 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.414077 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.414146 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.414172 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.414211 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.414247 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.518500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.518600 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.518618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.518645 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.518664 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.531048 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.531080 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:42 crc kubenswrapper[4952]: E1122 02:54:42.531244 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:42 crc kubenswrapper[4952]: E1122 02:54:42.531441 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.622044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.622094 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.622112 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.622134 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.622151 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.726919 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.727007 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.727035 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.727063 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.727098 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.825434 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.826865 4952 scope.go:117] "RemoveContainer" containerID="f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.830166 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.830237 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.830263 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.830292 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.830315 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.933840 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.934446 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.934469 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.934497 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:42 crc kubenswrapper[4952]: I1122 02:54:42.934516 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:42Z","lastTransitionTime":"2025-11-22T02:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.037995 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.038045 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.038059 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.038076 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.038088 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.140722 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.140782 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.140806 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.140836 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.140858 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.245309 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.245366 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.245377 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.245395 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.245406 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.347727 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.347765 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.347775 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.347790 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.347801 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.417258 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.417312 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.417322 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.417340 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.417351 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.430471 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.433824 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.433858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.433869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.433889 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.433900 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.447939 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.455390 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.455431 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.455442 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.455459 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.455470 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.469576 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.473777 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.473804 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.473814 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.473829 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.473839 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.488359 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.491369 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.491397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.491405 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.491419 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.491429 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.507778 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.507939 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.510014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
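Every status-patch attempt above fails the same way: the API server cannot call the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-11-22. A minimal sketch (not part of the log or of any OpenShift tooling) that confirms this from the node itself, assuming Python and the third-party cryptography package are available:

```python
# check_webhook_cert.py -- inspect the validity window of the webhook's
# serving certificate. Hypothetical helper, not an official tool; host and
# port are taken from the webhook URL in the log above.
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509  # third-party: pip install cryptography

HOST, PORT = "127.0.0.1", 9743

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False       # we only want to read the certificate,
ctx.verify_mode = ssl.CERT_NONE  # not validate it (it is expired)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock) as tls:
        der = tls.getpeercert(binary_form=True)  # DER bytes of the leaf cert

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)
print("expired:  ", now > cert.not_valid_after.replace(tzinfo=timezone.utc))
```

Against the state captured in this log, the sketch would be expected to print a notAfter of 2025-08-24 17:21:41 and expired: True; on CRC the usual remedy is to let the cluster rotate its certificates with a correct system clock before the kubelet retries.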
event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.510086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.510117 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.510139 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.510156 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.530578 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.530623 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.530751 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.530858 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.613473 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.613522 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.613533 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.613571 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.613585 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.716083 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.716117 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.716128 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.716143 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.716153 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.819965 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.820037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.820056 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.820086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.820105 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.923603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.923706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.923726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.923790 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.923810 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:43Z","lastTransitionTime":"2025-11-22T02:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.945938 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/2.log" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.947053 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/1.log" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.951875 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477" exitCode=1 Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.951948 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477"} Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.952012 4952 scope.go:117] "RemoveContainer" containerID="f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.953262 4952 scope.go:117] "RemoveContainer" containerID="5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477" Nov 22 02:54:43 crc kubenswrapper[4952]: E1122 02:54:43.953535 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:54:43 crc kubenswrapper[4952]: I1122 02:54:43.998994 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:43Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.021605 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.026853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.026918 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.026936 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.026987 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.027009 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.042201 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.060416 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba9
6e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.079132 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9b
d6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.099488 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.119024 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.129986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.130054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.130074 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.130102 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.130122 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.143012 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.163311 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.203762 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe8776
59746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 
02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.227913 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.233787 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.233834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc 
kubenswrapper[4952]: I1122 02:54:44.233848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.233871 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.233883 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.250611 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.272813 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.299063 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.318010 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.335333 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.336806 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.336848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.336865 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.336891 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.336909 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.352518 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.440529 4952 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.440666 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.440683 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.440706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.440725 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.531294 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.531397 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:44 crc kubenswrapper[4952]: E1122 02:54:44.531602 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:44 crc kubenswrapper[4952]: E1122 02:54:44.531713 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.543716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.543775 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.543796 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.543818 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.543838 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.647286 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.647331 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.647350 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.647375 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.647392 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.750239 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.750302 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.750321 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.750348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.750367 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.853721 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.853779 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.853797 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.853821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.853842 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.956377 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.956448 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.956473 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.956502 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.956522 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:44Z","lastTransitionTime":"2025-11-22T02:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:44 crc kubenswrapper[4952]: I1122 02:54:44.959750 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/2.log" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.060047 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.060110 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.060132 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.060159 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.060181 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.164096 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.164164 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.164185 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.164210 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.164232 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.267346 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.267418 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.267445 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.267480 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.267503 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.370146 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.370242 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.370265 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.370296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.370324 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.474486 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.474612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.474635 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.474663 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.474682 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.530828 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.530885 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:45 crc kubenswrapper[4952]: E1122 02:54:45.531029 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:45 crc kubenswrapper[4952]: E1122 02:54:45.531206 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.578703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.578829 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.578856 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.578888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.578912 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.682089 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.682188 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.682217 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.682246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.682264 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.785449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.785518 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.785579 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.785614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.785636 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.888720 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.888788 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.888805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.888830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.888849 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.991753 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.991821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.991839 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.991864 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:45 crc kubenswrapper[4952]: I1122 02:54:45.991883 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:45Z","lastTransitionTime":"2025-11-22T02:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.094744 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.094811 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.094829 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.094856 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.094891 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.197988 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.198050 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.198067 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.198091 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.198112 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.306979 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.307046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.307064 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.307090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.307111 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.410430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.410505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.410523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.410582 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.410609 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.514666 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.514743 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.514807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.514842 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.514867 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.530574 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.530574 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:46 crc kubenswrapper[4952]: E1122 02:54:46.530838 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:46 crc kubenswrapper[4952]: E1122 02:54:46.531135 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.556483 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.578813 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.597234 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.616498 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.619592 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.619671 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.619697 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.619726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.619745 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.634143 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.652843 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.687535 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.713779 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.722136 4952 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.722186 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.722205 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.722226 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.722241 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.734771 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca
8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.749295 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.773713 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.788468 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.803909 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.819371 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.824684 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.824710 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.824719 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.824735 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.824749 4952 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.834114 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.858322 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.881253 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:46Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.929280 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.929360 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:46 crc 
kubenswrapper[4952]: I1122 02:54:46.929408 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.929433 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:46 crc kubenswrapper[4952]: I1122 02:54:46.929451 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:46Z","lastTransitionTime":"2025-11-22T02:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.033848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.033910 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.033928 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.033953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.033972 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.136626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.137086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.137258 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.137455 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.137680 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.241192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.241598 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.241758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.241909 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.242037 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.346044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.346102 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.346122 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.346147 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.346165 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.449618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.449696 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.449712 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.449740 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.449759 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.530669 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.530780 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:47 crc kubenswrapper[4952]: E1122 02:54:47.530886 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:47 crc kubenswrapper[4952]: E1122 02:54:47.530992 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.553165 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.553222 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.553246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.553278 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.553300 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.656732 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.656809 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.656835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.656865 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.656886 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.760623 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.760703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.760726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.760754 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.760772 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.864455 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.864528 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.864572 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.864599 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.864617 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.974821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.974993 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.975018 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.975086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:47 crc kubenswrapper[4952]: I1122 02:54:47.975110 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:47Z","lastTransitionTime":"2025-11-22T02:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.078924 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.078977 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.078987 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.079004 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.079015 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.182497 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.182621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.182641 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.182668 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.182692 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.243489 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.243730 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.243820 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:20.243775881 +0000 UTC m=+84.549793194 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.243891 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.243943 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.243991 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244030 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:20.244004428 +0000 UTC m=+84.550021731 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.244066 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244162 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244184 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244210 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244232 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244360 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244379 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244398 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244210 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:20.244195913 +0000 UTC m=+84.550213216 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244455 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:20.244439739 +0000 UTC m=+84.550457052 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.244478 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:20.244466659 +0000 UTC m=+84.550483962 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.285658 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.285724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.285745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.285774 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.285795 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
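
Annotation: the mount and unmount failures above are parked with "No retries permitted until ... (durationBeforeRetry 32s)"; elsewhere in this window the same mechanism shows 16s. Those values fit a doubling retry schedule. A generic exponential-backoff sketch for orientation only; the initial delay and cap here are illustrative assumptions, not the kubelet's actual constants:

    # Illustrative exponential backoff (not kubelet source).
    # The 16s and 32s durationBeforeRetry values above fit a doubling schedule.
    def backoff_schedule(initial=0.5, factor=2.0, cap=128.0):
        """Yield successive retry delays in seconds, doubling up to a cap."""
        delay = initial
        while True:
            yield min(delay, cap)
            delay *= factor

    gen = backoff_schedule()
    print([next(gen) for _ in range(8)])  # 0.5, 1, 2, 4, 8, 16, 32, 64
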
Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.389878 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.389937 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.389949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.389972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.389983 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.493389 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.493478 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.493495 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.493522 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.493564 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.531010 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.531053 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.531255 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.531407 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.596746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.596828 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.596849 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.596878 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.596900 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.649516 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.649797 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: E1122 02:54:48.650022 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:55:04.649987364 +0000 UTC m=+68.956004667 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.700525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.700630 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.700670 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.700714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.700752 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.804684 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.804771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.804797 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.804832 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.804851 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.907621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.907667 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.907676 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.907690 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:48 crc kubenswrapper[4952]: I1122 02:54:48.907701 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:48Z","lastTransitionTime":"2025-11-22T02:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.012879 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.012955 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.012974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.013001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.013021 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.120575 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.120643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.120666 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.120698 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.120721 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.224422 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.224495 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.224514 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.224570 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.224592 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.328074 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.328156 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.328176 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.328205 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.328225 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.431626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.431692 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.431708 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.431735 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.431755 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.530784 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.530955 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:49 crc kubenswrapper[4952]: E1122 02:54:49.530999 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:49 crc kubenswrapper[4952]: E1122 02:54:49.531243 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.534983 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.535061 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.535079 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.535102 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.535151 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.639538 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.639660 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.639687 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.639721 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.639749 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.743974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.744044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.744064 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.744093 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.744116 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.848001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.848073 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.848091 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.848118 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.848137 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.952190 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.952272 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.952335 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.952367 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:49 crc kubenswrapper[4952]: I1122 02:54:49.952385 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:49Z","lastTransitionTime":"2025-11-22T02:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.057166 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.057227 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.057240 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.057262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.057275 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.160550 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.160604 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.160614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.160629 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.160642 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.263745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.263825 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.263851 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.263882 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.263906 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.367377 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.367481 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.367500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.367525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.367549 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.471616 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.471729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.471792 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.471882 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.471904 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
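
Annotation: the five-line node-status stanza (NodeHasSufficientMemory through the "Node became not ready" condition) recurs at roughly 100ms intervals throughout this window; the setters.go:603 timestamps (.078, .182, .285, .389, ...) make that visible. To measure the cadence rather than eyeball it, a sketch over a saved copy of the journal; assumes Python 3 and the hypothetical filename "kubelet.log":

    import re
    from datetime import datetime

    # Timestamp of each "Node became not ready" record (setters.go lines only).
    TS = re.compile(r'I1122 (\d{2}:\d{2}:\d{2}\.\d{6}) \d+ setters\.go')

    def gaps_ms(log_text):
        """Milliseconds between successive 'Node became not ready' records."""
        times = [datetime.strptime(t, "%H:%M:%S.%f") for t in TS.findall(log_text)]
        return [(b - a).total_seconds() * 1000 for a, b in zip(times, times[1:])]

    with open("kubelet.log") as fh:  # hypothetical capture of this journal
        print(gaps_ms(fh.read())[:10])  # expect values near 100 ms
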
Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.531259 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.531259 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:50 crc kubenswrapper[4952]: E1122 02:54:50.531624 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:50 crc kubenswrapper[4952]: E1122 02:54:50.531914 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.576088 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.576173 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.576193 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.576222 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.576241 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.680068 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.680141 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.680159 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.680189 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.680207 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.783724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.783794 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.783815 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.783842 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.783860 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.887063 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.887175 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.887197 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.887226 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.887244 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.989042 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.989079 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.989090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.989109 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:50 crc kubenswrapper[4952]: I1122 02:54:50.989124 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:50Z","lastTransitionTime":"2025-11-22T02:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.092972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.093038 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.093056 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.093078 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.093096 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.196195 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.196254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.196270 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.196294 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.196312 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.299201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.299263 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.299322 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.299348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.299367 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.403201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.403299 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.403330 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.403367 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.403398 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.506739 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.506810 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.506829 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.506856 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.506875 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.530165 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.530245 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:51 crc kubenswrapper[4952]: E1122 02:54:51.530398 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:51 crc kubenswrapper[4952]: E1122 02:54:51.530633 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.615615 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.615690 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.615716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.615746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.615768 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.719599 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.719695 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.719721 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.719756 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.719780 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.800066 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.817289 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.823120 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.823181 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.823201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.823227 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.823246 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.835853 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.854422 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:
31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.887490 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.908117 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.927057 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.927999 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.928029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.928038 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.928052 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.928062 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:51Z","lastTransitionTime":"2025-11-22T02:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.946352 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.972837 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.973556 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.987161 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 02:54:51 crc kubenswrapper[4952]: I1122 02:54:51.994070 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:51Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.026130 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe8776
59746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 
02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.033387 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.033452 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.033470 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.033500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.033518 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.053968 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.077699 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.095166 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.112764 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.132065 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.136886 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.136939 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.136956 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.136981 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.136999 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.156267 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.175064 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.192685 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.214686 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.234819 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.240427 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.240510 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.240540 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.240618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.240643 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.255870 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.286329 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.312091 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.333666 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.344226 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.344287 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.344307 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.344339 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.344360 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.355307 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.379482 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.398362 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.417799 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.436954 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.448409 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.448510 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.448530 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.448589 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.448610 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.459669 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.483415 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.505918 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.527246 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.530684 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.530786 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:52 crc kubenswrapper[4952]: E1122 02:54:52.530897 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:52 crc kubenswrapper[4952]: E1122 02:54:52.531022 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552382 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552462 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552389 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z" Nov 22 
02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552487 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.552589 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.590077 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-
22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:52Z is after 2025-08-24T17:21:41Z"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.656434 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.656541 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.656588 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.656614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.656633 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.760728 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.760826 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.760865 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.760903 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.760923 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.865725 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.865791 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.865803 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.865827 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.865845 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.969117 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.969191 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.969212 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.969245 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:52 crc kubenswrapper[4952]: I1122 02:54:52.969315 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:52Z","lastTransitionTime":"2025-11-22T02:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.072793 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.072887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.072913 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.072947 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.072974 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.176632 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.176698 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.176716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.176743 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.176761 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.280143 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.280213 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.280235 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.280261 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.280281 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.383917 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.383986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.383998 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.384021 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.384042 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.487950 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.488046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.488068 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.488094 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.488115 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.530204 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.530482 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.531688 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.531961 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.534253 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.534322 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.534334 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.534357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.534375 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.557870 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:53Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.563692 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.563728 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.563741 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.563760 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.563773 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.585375 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:53Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.592054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.592132 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.592155 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.592184 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.592206 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.616937 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:53Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.622702 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.622799 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.622819 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.622842 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.622858 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.644197 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:53Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.649970 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.650034 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.650055 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.650090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.650101 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.674434 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:53Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:53 crc kubenswrapper[4952]: E1122 02:54:53.674578 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.678135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.678170 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.678182 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.678201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.678214 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.781963 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.782020 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.782029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.782047 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.782058 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.886196 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.886253 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.886273 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.886302 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.886320 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.994254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.994385 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.994417 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.994479 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:53 crc kubenswrapper[4952]: I1122 02:54:53.994503 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:53Z","lastTransitionTime":"2025-11-22T02:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.098323 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.098415 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.098435 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.098466 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.098489 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.202180 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.202298 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.202317 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.202347 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.202378 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.306490 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.306625 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.306657 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.306694 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.306720 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.410638 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.410745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.410764 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.410790 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.410813 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.517328 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.517408 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.517429 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.517460 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.517480 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.531170 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.531260 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:54 crc kubenswrapper[4952]: E1122 02:54:54.531422 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:54 crc kubenswrapper[4952]: E1122 02:54:54.531817 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.622086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.622149 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.622166 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.622192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.622211 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.726848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.726930 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.726948 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.726978 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.726998 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.831091 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.831165 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.831188 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.831216 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.831237 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.934848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.934925 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.934943 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.934971 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:54 crc kubenswrapper[4952]: I1122 02:54:54.934996 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:54Z","lastTransitionTime":"2025-11-22T02:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.038194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.038269 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.038288 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.038323 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.038345 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.142062 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.142122 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.142141 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.142167 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.142188 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.245769 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.245841 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.245865 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.245899 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.245920 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.349645 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.349714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.349738 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.349772 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.349797 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.452926 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.453006 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.453029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.453066 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.453086 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.530913 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.531048 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:55 crc kubenswrapper[4952]: E1122 02:54:55.531171 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:55 crc kubenswrapper[4952]: E1122 02:54:55.531345 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.555989 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.556047 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.556064 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.556087 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.556109 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.659391 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.659468 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.659488 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.659516 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.659535 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.777275 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.777348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.777361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.777386 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.777402 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.882441 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.882513 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.882528 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.882577 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.882594 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.987067 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.987139 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.987156 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.987184 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:55 crc kubenswrapper[4952]: I1122 02:54:55.987202 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:55Z","lastTransitionTime":"2025-11-22T02:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.089722 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.089783 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.089809 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.089834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.089853 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.192218 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.192279 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.192303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.192334 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.192357 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.295906 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.295983 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.296002 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.296037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.296064 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.400049 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.400131 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.400157 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.400200 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.400224 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.502989 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.503607 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.503627 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.503655 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.503672 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.530367 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.530452 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:56 crc kubenswrapper[4952]: E1122 02:54:56.530652 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:56 crc kubenswrapper[4952]: E1122 02:54:56.530802 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.555135 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee
1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.576846 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.608396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.608463 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.608480 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.608509 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.608527 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.609936 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f88772c724f002a09ef6ff0197a2bb73200e8b375764aae0cd2d80d8cfa83579\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:28Z\\\",\\\"message\\\":\\\"0.0.1:29103\\\\\\\"\\\\nI1122 02:54:28.913401 6417 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1122 02:54:28.912936 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1122 02:54:28.913425 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nI1122 02:54:28.913464 6417 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-7wlpk\\\\nI1122 02:54:28.913479 6417 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-ts9bc\\\\nF1122 02:54:28.913480 6417 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.634933 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.654364 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.669030 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.693048 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.707598 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.712068 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.712116 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.712126 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.712144 4952 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.712153 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.723092 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.738679 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.753167 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.770591 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.784119 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.795244 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 
02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.807475 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.814990 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.815053 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.815065 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.815090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.815103 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.831041 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.848149 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.864581 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.918569 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.918612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.918623 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.918644 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:56 crc kubenswrapper[4952]: I1122 02:54:56.918656 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:56Z","lastTransitionTime":"2025-11-22T02:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.021629 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.021718 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.021746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.021782 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.021804 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.125711 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.125767 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.125785 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.125809 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.125827 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.230017 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.230087 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.230101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.230128 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.230143 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.333809 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.333882 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.333963 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.334000 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.334024 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.437449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.437523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.437583 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.437624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.437652 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.530987 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.531030 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:57 crc kubenswrapper[4952]: E1122 02:54:57.531207 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:57 crc kubenswrapper[4952]: E1122 02:54:57.531816 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.533252 4952 scope.go:117] "RemoveContainer" containerID="5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477" Nov 22 02:54:57 crc kubenswrapper[4952]: E1122 02:54:57.533633 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.541003 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.541079 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.541100 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.541131 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.541149 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.552942 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.576156 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e
8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.596431 4952 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.630474 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe8776
59746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.644707 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.644783 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.644807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.644838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.644860 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.650007 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.669647 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f5
3b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.695158 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.718944 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.737855 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.748742 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.748772 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.748781 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.748795 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.748805 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.756418 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.780443 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.818796 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.842544 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.852224 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.852303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.852324 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.852351 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.852376 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.865595 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.886583 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba9
6e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.906755 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9b
d6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.929792 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.947687 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.955038 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.955110 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.955129 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.955158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:57 crc kubenswrapper[4952]: I1122 02:54:57.955180 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:57Z","lastTransitionTime":"2025-11-22T02:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.058923 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.058989 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.058998 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.059012 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.059022 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.161705 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.161763 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.161776 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.161795 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.161808 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.265290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.265346 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.265361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.265381 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.265396 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.370865 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.370926 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.370936 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.370962 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.370975 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.474290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.474411 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.474423 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.474444 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.474461 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.530913 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.530963 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:58 crc kubenswrapper[4952]: E1122 02:54:58.531139 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:54:58 crc kubenswrapper[4952]: E1122 02:54:58.531324 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.577353 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.577414 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.577425 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.577449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.577462 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.680217 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.680287 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.680306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.680337 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.680356 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.783668 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.783729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.783745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.783770 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.783789 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.886979 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.887037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.887060 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.887086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.887104 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.991501 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.991597 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.991619 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.991649 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:58 crc kubenswrapper[4952]: I1122 02:54:58.991672 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:58Z","lastTransitionTime":"2025-11-22T02:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.095336 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.095395 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.095413 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.095438 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.095457 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.199621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.199730 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.199754 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.199792 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.199818 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.303227 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.303311 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.303340 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.303377 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.303400 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.406523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.406597 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.406608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.406632 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.406676 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.509479 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.509517 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.509525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.509560 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.509574 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.530456 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:59 crc kubenswrapper[4952]: E1122 02:54:59.530632 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.530830 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:59 crc kubenswrapper[4952]: E1122 02:54:59.530885 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.613176 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.613249 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.613272 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.613306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.613329 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.717313 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.717390 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.717402 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.717462 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.717509 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.821406 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.821461 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.821470 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.821489 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.821502 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.924819 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.924850 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.924859 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.924875 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:59 crc kubenswrapper[4952]: I1122 02:54:59.924886 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:59Z","lastTransitionTime":"2025-11-22T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.027716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.028148 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.028268 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.028363 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.028444 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.132726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.132805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.132828 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.132859 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.132879 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.236448 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.236502 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.236517 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.236541 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.236573 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.339659 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.339714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.339726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.339745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.339756 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.443264 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.443347 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.443366 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.443396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.443417 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.531614 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.531689 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:55:00 crc kubenswrapper[4952]: E1122 02:55:00.531793 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc"
Nov 22 02:55:00 crc kubenswrapper[4952]: E1122 02:55:00.532142 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.545378 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.545424 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.545434 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.545455 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.545468 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.648643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.648691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.648700 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.648715 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.648729 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.752661 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.752716 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.752729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.752750 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.752764 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.856963 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.857026 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.857044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.857072 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.857099 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.961100 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.961171 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.961194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.961224 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:00 crc kubenswrapper[4952]: I1122 02:55:00.961246 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:00Z","lastTransitionTime":"2025-11-22T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.064220 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.064280 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.064296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.064319 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.064335 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.167307 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.167362 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.167375 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.167393 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.167408 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.270406 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.270462 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.270477 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.270504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.270518 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.373226 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.373326 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.373337 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.373356 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.373366 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.476093 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.476160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.476178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.476205 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.476223 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.531008 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.531140 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:55:01 crc kubenswrapper[4952]: E1122 02:55:01.531260 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:55:01 crc kubenswrapper[4952]: E1122 02:55:01.531360 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.579669 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.579753 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.580343 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.580373 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.580387 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.682805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.682869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.682884 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.682900 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.682910 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.785962 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.786050 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.786067 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.786089 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.786102 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.889250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.889292 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.889309 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.889335 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.889352 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.991525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.991608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.991626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.991654 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:01 crc kubenswrapper[4952]: I1122 02:55:01.991674 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:01Z","lastTransitionTime":"2025-11-22T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.094435 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.094509 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.094582 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.094603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.095164 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.198350 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.198387 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.198396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.198410 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.198421 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.301384 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.301443 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.301461 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.301486 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.301504 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.405254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.405315 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.405332 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.405357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.405377 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.508247 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.508299 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.508312 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.508336 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.508351 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.530821 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.530872 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm"
Nov 22 02:55:02 crc kubenswrapper[4952]: E1122 02:55:02.531078 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:55:02 crc kubenswrapper[4952]: E1122 02:55:02.531468 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.612525 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.612639 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.612662 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.612695 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.612714 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.715953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.716203 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.716224 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.716250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.716267 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.821073 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.821145 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.821164 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.821191 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.821213 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.924160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.924229 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.924243 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.924264 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:02 crc kubenswrapper[4952]: I1122 02:55:02.924279 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:02Z","lastTransitionTime":"2025-11-22T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.026290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.026333 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.026341 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.026360 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.026372 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.129416 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.129466 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.129477 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.129494 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.129507 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.233114 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.233168 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.233181 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.233202 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.233217 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.336329 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.336430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.336446 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.336473 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.336487 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.439389 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.439457 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.439471 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.439496 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.439513 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.530340 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.530538 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.530685 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.530802 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.542294 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.542339 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.542354 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.542374 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.542386 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.645764 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.645827 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.645839 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.645858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.645875 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.704283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.704356 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.704376 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.704404 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.704427 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.721701 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.727905 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.727962 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.727980 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.728007 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.728026 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.746412 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.752332 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.752401 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.752422 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.752453 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.752476 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.771064 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.776343 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.776420 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
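The status patch keeps failing at the same point: the apiserver cannot complete a TLS handshake with the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because the serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-22. A minimal Go sketch of that validity check against the same endpoint (the address comes from the log; InsecureSkipVerify is deliberate here, since normal verification of an expired certificate would abort the handshake before the certificate could be inspected):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the failing webhook call in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Deliberate: verifying an expired certificate would abort the
		// handshake before we could inspect the certificate at all.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter)
	switch {
	case now.After(cert.NotAfter):
		// The branch the kubelet error corresponds to: "current time
		// 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z".
		fmt.Println("certificate has expired")
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```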
event="NodeHasNoDiskPressure" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.776439 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.776467 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.776487 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.801648 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.807158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.807242 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
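Alongside the webhook failure, every heartbeat reports the node NotReady because the container runtime finds no CNI network configuration. The check behind the recurring message amounts to looking for a network config file in the directory named in the log; a rough Go sketch under that assumption (the real CRI-O/ocicni matching rules also parse file contents, so this is a simplification):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the recurring kubelet message above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("NetworkReady=false:", err)
		return
	}
	for _, e := range entries {
		// Simplified matching on extension only; ocicni also
		// inspects the file contents before accepting a config.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			return
		}
	}
	fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", dir)
}
```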
event="NodeHasNoDiskPressure" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.807277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.807314 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.807341 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.828126 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:03Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:03 crc kubenswrapper[4952]: E1122 02:55:03.828262 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.829998 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
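The 02:55:03.828262 entry is the kubelet giving up on this update cycle: the status patch is attempted a fixed number of times (upstream the constant is nodeStatusUpdateRetry = 5) before "update node status exceeds retry count" is logged and the kubelet waits for the next sync. A simplified Go sketch of that loop; the constant name matches the kubelet source, everything else is illustrative:

```go
package main

import (
	"errors"
	"fmt"
)

// Mirrors the upstream kubelet constant; the rest of this file is an
// illustrative reduction of the kubelet's tryUpdateNodeStatus loop.
const nodeStatusUpdateRetry = 5

func updateNodeStatus(tryUpdate func(attempt int) error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdate(i); err != nil {
			// Matches the repeated E-lines from kubelet_node_status.go:585.
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	// Matches the final E-line from kubelet_node_status.go:572.
	return errors.New("update node status exceeds retry count")
}

func main() {
	err := updateNodeStatus(func(attempt int) error {
		// Every attempt hits the same expired-certificate webhook failure.
		return fmt.Errorf("attempt %d: failed calling webhook %q: certificate has expired",
			attempt, "node.network-node-identity.openshift.io")
	})
	fmt.Println(err)
}
```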
event="NodeHasSufficientMemory" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.830047 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.830061 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.830082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.830097 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.933703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.933766 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.933781 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.933801 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:03 crc kubenswrapper[4952]: I1122 02:55:03.933814 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:03Z","lastTransitionTime":"2025-11-22T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.035900 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.035953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.035967 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.035985 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.035999 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.140309 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.140389 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.140401 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.140423 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.140438 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.244154 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.244223 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.244242 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.244269 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.244284 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.347051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.347131 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.347163 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.347213 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.347240 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.453305 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.453399 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.453420 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.453448 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.453468 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.530391 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.530612 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:04 crc kubenswrapper[4952]: E1122 02:55:04.530802 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:04 crc kubenswrapper[4952]: E1122 02:55:04.530987 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.545020 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.557103 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.557150 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.557165 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.557191 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.557207 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.660162 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.660212 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.660225 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.660246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.660260 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.743009 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:04 crc kubenswrapper[4952]: E1122 02:55:04.743156 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:55:04 crc kubenswrapper[4952]: E1122 02:55:04.743217 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. 
No retries permitted until 2025-11-22 02:55:36.743201891 +0000 UTC m=+101.049219164 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.762856 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.762901 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.762915 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.762935 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.762949 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.865916 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.865972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.865991 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.866015 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.866034 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.968493 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.968566 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.968581 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.968603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:04 crc kubenswrapper[4952]: I1122 02:55:04.968616 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:04Z","lastTransitionTime":"2025-11-22T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.071768 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.071851 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.071873 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.071906 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.071930 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.175212 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.175264 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.175277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.175298 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.175314 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.278905 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.278957 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.278969 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.278990 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.279004 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.382016 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.382070 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.382081 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.382102 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.382115 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.484663 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.484736 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.484751 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.484771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.484785 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.530472 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.530528 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:05 crc kubenswrapper[4952]: E1122 02:55:05.530672 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:05 crc kubenswrapper[4952]: E1122 02:55:05.530771 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.588178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.588250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.588260 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.588277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.588290 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.691537 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.691606 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.691617 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.691637 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.691650 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.794743 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.794807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.794819 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.794853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.794866 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.898588 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.898652 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.898665 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.898684 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:05 crc kubenswrapper[4952]: I1122 02:55:05.898701 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:05Z","lastTransitionTime":"2025-11-22T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.002435 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.002504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.002529 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.002603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.002628 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.047341 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/0.log" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.047439 4952 generic.go:334] "Generic (PLEG): container finished" podID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" containerID="6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489" exitCode=1 Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.047503 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerDied","Data":"6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.048297 4952 scope.go:117] "RemoveContainer" containerID="6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.064262 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.084764 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.098962 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53
:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.105881 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.105916 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.105931 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.105949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.105962 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.111061 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.128562 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.142200 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.166086 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe8776
59746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.184624 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.200684 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.209174 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.209219 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.209233 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.209254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.209266 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.215695 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.231037 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.243228 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.254450 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.265489 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.311794 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.311848 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.311862 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.311885 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.311902 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.318007 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.364487 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.385736 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.398821 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.412345 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.414706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.414741 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.414753 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.414770 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.414781 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.517112 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.517151 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.517163 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.517180 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.517194 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.530448 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.530472 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:06 crc kubenswrapper[4952]: E1122 02:55:06.530665 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:06 crc kubenswrapper[4952]: E1122 02:55:06.530858 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.549958 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.564997 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.579756 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.592378 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621256 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe8776
59746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621462 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621491 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621506 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621531 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.621575 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.655332 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.674472 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f5
3b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.692457 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.712280 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.724964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.725025 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.725035 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.725054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.725067 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.725086 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.738728 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.751885 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.789419 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.807658 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.825391 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.827679 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.827727 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.827735 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.827754 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.827764 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.846770 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.861803 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124
324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.875721 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.891775 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.931610 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.931663 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.931677 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.931697 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:06 crc kubenswrapper[4952]: I1122 02:55:06.931710 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:06Z","lastTransitionTime":"2025-11-22T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.035517 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.035614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.035627 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.035653 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.035670 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.053892 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/0.log" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.053981 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerStarted","Data":"99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.069137 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.083120 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.099442 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.118394 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.138385 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.138442 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.138458 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.138478 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.138491 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.139140 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.153472 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.166638 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.179923 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.202423 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796
587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.217896 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.234569 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.241932 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.241995 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.242008 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.242029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.242040 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.252928 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.269535 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.285320 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.307413 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.328017 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.344489 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.344601 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.344623 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.344651 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.344673 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.345128 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.361851 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902
a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.384363 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.447835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.447883 
4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.447893 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.447910 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.447922 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.531120 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.531190 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:07 crc kubenswrapper[4952]: E1122 02:55:07.531309 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:07 crc kubenswrapper[4952]: E1122 02:55:07.531475 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.551357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.551434 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.551463 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.551499 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.551524 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.659283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.659339 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.659353 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.659378 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.659392 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.761792 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.761849 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.761858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.761878 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.761889 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.865502 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.865639 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.865668 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.865705 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.865728 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.971419 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.971475 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.971485 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.971505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:07 crc kubenswrapper[4952]: I1122 02:55:07.971518 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:07Z","lastTransitionTime":"2025-11-22T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.075287 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.075348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.075363 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.075399 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.075416 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.178309 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.178403 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.178421 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.178467 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.178501 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.281955 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.282004 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.282014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.282029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.282039 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.385791 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.385854 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.385867 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.385887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.385902 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.488576 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.488626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.488639 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.488659 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.488672 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.530433 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.530433 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:08 crc kubenswrapper[4952]: E1122 02:55:08.530609 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:08 crc kubenswrapper[4952]: E1122 02:55:08.530721 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.591534 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.591608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.591624 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.591645 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.591665 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.695279 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.695341 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.695355 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.695374 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.695386 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.799001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.799104 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.799126 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.799186 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.799205 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.902907 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.903052 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.903123 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.903158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:08 crc kubenswrapper[4952]: I1122 02:55:08.903221 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:08Z","lastTransitionTime":"2025-11-22T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.007299 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.007355 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.007367 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.007391 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.007407 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.109749 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.109830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.109853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.109882 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.109906 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.212158 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.212201 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.212216 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.212233 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.212246 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.315178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.315225 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.315234 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.315250 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.315259 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.418512 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.418608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.418628 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.418653 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.418669 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.522060 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.522117 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.522133 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.522155 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.522169 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.530513 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.530619 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:09 crc kubenswrapper[4952]: E1122 02:55:09.530783 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:09 crc kubenswrapper[4952]: E1122 02:55:09.530999 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.624998 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.625066 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.625084 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.625106 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.625121 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.728456 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.728505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.728516 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.728536 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.728568 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.831638 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.831703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.831724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.831746 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.831765 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.935380 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.935430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.935446 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.935469 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:09 crc kubenswrapper[4952]: I1122 02:55:09.935484 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:09Z","lastTransitionTime":"2025-11-22T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.038782 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.038846 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.038863 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.038888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.038903 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.142619 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.142689 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.142701 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.142726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.142740 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.245473 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.245512 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.245523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.245538 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.245573 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.348025 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.348066 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.348078 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.348095 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.348107 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.451062 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.451109 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.451120 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.451138 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.451150 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.530668 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:10 crc kubenswrapper[4952]: E1122 02:55:10.530911 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.531176 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.531877 4952 scope.go:117] "RemoveContainer" containerID="5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477" Nov 22 02:55:10 crc kubenswrapper[4952]: E1122 02:55:10.531928 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.554622 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.554723 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.554749 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.554781 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.554804 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.657951 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.658019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.658043 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.658073 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.658094 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.761727 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.761777 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.761810 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.761830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.761843 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.865042 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.865115 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.865133 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.865160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.865178 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.968577 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.968634 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.968654 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.968680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:10 crc kubenswrapper[4952]: I1122 02:55:10.968700 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:10Z","lastTransitionTime":"2025-11-22T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071307 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/2.log" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071390 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071433 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071476 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.071493 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.075162 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.077430 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.094058 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.115266 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.137940 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796
587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.156176 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.172302 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.174063 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.174103 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.174118 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.174135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.174156 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.189486 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.205677 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.221094 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.245749 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.271351 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.275887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.275918 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc 
kubenswrapper[4952]: I1122 02:55:11.275927 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.275942 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.275953 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.287589 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.299075 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.324579 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.336757 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.350053 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.363508 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.379235 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.379286 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.379298 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.379319 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.379332 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.380268 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.400668 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.420569 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.482274 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.482334 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.482351 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.482373 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.482390 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.531093 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.531252 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:11 crc kubenswrapper[4952]: E1122 02:55:11.531364 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:11 crc kubenswrapper[4952]: E1122 02:55:11.531490 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.585894 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.585977 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.586000 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.586029 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.586050 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.689234 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.689302 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.689316 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.689341 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.689358 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.792416 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.792514 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.792535 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.792595 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.792621 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.896687 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.896773 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.896802 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.896838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:11 crc kubenswrapper[4952]: I1122 02:55:11.896859 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:11Z","lastTransitionTime":"2025-11-22T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:11.999930 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.000014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.000040 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.000074 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.000099 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.081701 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/3.log" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.083000 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/2.log" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.087780 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" exitCode=1 Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.087853 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.087936 4952 scope.go:117] "RemoveContainer" containerID="5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.090174 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 02:55:12 crc kubenswrapper[4952]: E1122 02:55:12.090739 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.102603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.102810 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.102835 4952 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.102869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.102888 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.117229 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.134085 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.156584 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.178756 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.206888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.206974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.206991 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.207014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.207031 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.210119 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc297afce5bdf4831d8dec9ad52eb698ffe877659746349dfd1f4b5e72d0477\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:43Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1122 02:54:43.784188 6644 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1122 02:54:43.784273 6644 factory.go:1336] Added *v1.Node event handler 7\\\\nI1122 02:54:43.784314 6644 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784615 6644 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1122 02:54:43.784613 6644 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 02:54:43.784638 6644 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 02:54:43.784651 6644 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:54:43.784699 6644 factory.go:656] Stopping watch factory\\\\nI1122 02:54:43.784711 6644 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1122 02:54:43.784723 6644 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:54:43.784734 6644 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:54:43.784742 6644 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:54:43.784743 6644 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:54:43.784788 6644 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 02:54:43.784896 6644 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:11Z\\\",\\\"message\\\":\\\"to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z]\\\\nI1122 02:55:11.514755 7010 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"d7d7b270-1480-47f8-bdf9-690dbab310cb\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.233428 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.255130 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.276214 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.337256 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.337313 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.337328 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.337350 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.337366 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.344566 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.360425 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.376681 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.391315 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.416627 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e47
0c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.437829 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.441509 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.441580 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.441590 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.441608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.441621 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.466917 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.484533 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba9
6e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.503612 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9b
d6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.521534 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.530869 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:12 crc kubenswrapper[4952]: E1122 02:55:12.531146 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.531235 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:12 crc kubenswrapper[4952]: E1122 02:55:12.531428 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.542079 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.544215 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.544277 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.544296 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.544325 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.544345 4952 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.648714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.648807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.648838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.648875 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.648900 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.751897 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.751955 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.751972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.751996 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.752015 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.855303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.855385 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.855410 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.855487 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.855526 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.959195 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.959253 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.959275 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.959303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:12 crc kubenswrapper[4952]: I1122 02:55:12.959320 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:12Z","lastTransitionTime":"2025-11-22T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.063782 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.063898 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.063928 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.063959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.063979 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.097250 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/3.log" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.104229 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 02:55:13 crc kubenswrapper[4952]: E1122 02:55:13.104644 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.138087 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57
Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.161061 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.167189 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.167228 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.167251 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.167283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.167308 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.182910 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.201017 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.218468 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.239116 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.259436 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.271241 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.271311 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.271330 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.271353 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.271371 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.282481 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.301731 4952 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.317881 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.341929 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.362287 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.374878 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.374999 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.375021 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.375051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.375074 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.396848 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:11Z\\\",\\\"message\\\":\\\"to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z]\\\\nI1122 02:55:11.514755 7010 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"d7d7b270-1480-47f8-bdf9-690dbab310cb\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:55:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.412790 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.434798 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.457268 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.478317 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.478397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.478429 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.478482 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.478511 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.481871 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.498998 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.516990 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.530071 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.530191 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:13 crc kubenswrapper[4952]: E1122 02:55:13.530254 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:13 crc kubenswrapper[4952]: E1122 02:55:13.530475 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.581960 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.582020 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.582033 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.582052 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.582065 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.686660 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.686731 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.686762 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.686788 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.686803 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.790179 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.790245 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.790264 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.790289 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.790307 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.893011 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.893091 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.893111 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.893142 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.893163 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.996258 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.996318 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.996339 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.996364 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:13 crc kubenswrapper[4952]: I1122 02:55:13.996384 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:13Z","lastTransitionTime":"2025-11-22T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.079608 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.079680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.079697 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.079725 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.079748 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.101211 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.107466 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.107536 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.107590 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.107621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.107637 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.127612 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.134001 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.134054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.134068 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.134089 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.134102 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.151310 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.157696 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.157805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.157823 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.157853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.157872 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.175616 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.180263 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.180342 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.180364 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.180396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.180417 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.199885 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.200143 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.203167 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
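The failed node-status patch above ends with the actual root cause: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, roughly three months before the node's current clock of 2025-11-22T02:55:14Z, so every status update is rejected until the retry budget is exhausted ("update node status exceeds retry count"). A minimal Go sketch for confirming the validity window from the node itself, assuming only that the endpoint taken from the log is reachable; the probe is illustrative and not part of the kubelet:

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Endpoint copied from the webhook URL in the log. InsecureSkipVerify
        // because the point is to read the served certificate, not to trust it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            fmt.Println("expired, consistent with the x509 error above")
        }
    }
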
event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.203241 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.203265 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.203297 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.203321 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.307785 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.307877 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.307907 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.307946 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.307967 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.411986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.412038 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.412051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.412073 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.412085 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.515993 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.516107 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.516130 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.516160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.516181 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.558969 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.559072 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.559106 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.559270 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.559469 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:14 crc kubenswrapper[4952]: E1122 02:55:14.559714 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
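The five-record cycle above (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, then the setters.go condition) repeats roughly every 100ms for as long as no CNI configuration exists in /etc/kubernetes/cni/net.d/. The condition payload printed by setters.go is plain JSON and easy to pull apart; a small Go sketch, using a hand-rolled struct that mirrors only the keys visible in these log lines rather than the upstream v1.NodeCondition type:

    package main

    import (
        "encoding/json"
        "fmt"
        "log"
    )

    // nodeCondition covers only the fields present in the condition={...}
    // payloads above; it is a stand-in for illustration, not the API type.
    type nodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        // One condition payload copied verbatim from the log above.
        raw := `{"type":"Ready","status":"False","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

        var c nodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            log.Fatal(err)
        }
        fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
    }
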
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.619403 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.619508 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.619527 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.619616 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.619646 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.723134 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.723243 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.723262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.723292 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.723312 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.826662 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.826732 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.826756 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.826821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.826842 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.930337 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.930452 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.930504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.930534 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:14 crc kubenswrapper[4952]: I1122 02:55:14.930596 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:14Z","lastTransitionTime":"2025-11-22T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.034058 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.034136 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.034149 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.034179 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.034194 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.138772 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.138849 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.138869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.138898 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.138919 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.242243 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.242311 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.242335 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.242365 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.242386 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.345418 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.345477 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.345495 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.345518 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.345534 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.450051 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.450138 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.450161 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.450190 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.450212 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.530441 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:15 crc kubenswrapper[4952]: E1122 02:55:15.530690 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.554435 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.554487 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.554504 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.554528 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.554582 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.658462 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.658592 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.658612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.658636 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.658653 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.762828 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.762901 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.762922 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.762945 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.762965 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.866168 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.866246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.866270 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.866299 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.866321 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.969949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.970002 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.970022 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.970046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:15 crc kubenswrapper[4952]: I1122 02:55:15.970063 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:15Z","lastTransitionTime":"2025-11-22T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.073902 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.073964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.073982 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.074011 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.074033 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.177160 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.177229 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.177246 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.177272 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.177291 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.280751 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.280818 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.280840 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.280867 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.280885 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.384758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.384831 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.384849 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.384879 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.384897 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.488948 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.489028 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.489046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.489071 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.489091 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.530467 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:16 crc kubenswrapper[4952]: E1122 02:55:16.530722 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.530812 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.531038 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:16 crc kubenswrapper[4952]: E1122 02:55:16.531032 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:16 crc kubenswrapper[4952]: E1122 02:55:16.531138 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
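The status_manager records that follow embed the entire failed patch inside the quoted err field, which is why every quote in the payload surfaces as \\\" in the journal. One level of that quoting can be undone with strconv.Unquote; a sketch over a shortened, hypothetical prefix of such a payload (the real patches below run to several kilobytes):

    package main

    import (
        "fmt"
        "log"
        "strconv"
    )

    func main() {
        // Hypothetical, abbreviated stand-in for a patch body as journald
        // shows it: a quoted string whose inner quotes are backslash-escaped.
        escaped := `"{\"metadata\":{\"uid\":\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\"},\"status\":{}}"`

        plain, err := strconv.Unquote(escaped)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(plain) // prints the decoded JSON object with bare quotes
    }
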
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.563681 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:11Z\\\",\\\"message\\\":\\\"to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z]\\\\nI1122 02:55:11.514755 7010 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"d7d7b270-1480-47f8-bdf9-690dbab310cb\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:55:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.584824 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.591262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.591496 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc 
kubenswrapper[4952]: I1122 02:55:16.591726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.591942 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.592456 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.602691 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.617975 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.641605 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.656361 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.669500 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.680663 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.695953 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.696019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.696042 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.696076 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.696101 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.699328 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.719972 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.738516 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.753279 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.775481 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 
02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.799964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.800416 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.801808 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.801943 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.802094 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.803103 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.823472 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.843263 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.861121 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.881757 4952 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.900044 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.905474 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.905524 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.905578 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.905603 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:16 crc kubenswrapper[4952]: I1122 02:55:16.905615 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:16Z","lastTransitionTime":"2025-11-22T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.008652 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.008714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.008788 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.008817 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.008909 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.111964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.112042 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.112060 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.112090 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.112109 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.214118 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.214243 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.214260 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.214285 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.214306 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.317783 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.317968 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.317992 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.318019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.318078 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.421617 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.421714 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.421734 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.421757 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.421774 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.526004 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.526082 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.526108 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.526144 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.526161 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.530390 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:17 crc kubenswrapper[4952]: E1122 02:55:17.530634 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.629863 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.629933 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.629956 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.629986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.630004 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.734000 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.734063 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.734139 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.734167 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.734184 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.839505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.839656 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.839764 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.839835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.839852 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.943530 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.943654 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.943673 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.943700 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:17 crc kubenswrapper[4952]: I1122 02:55:17.943719 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:17Z","lastTransitionTime":"2025-11-22T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.047193 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.047261 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.047281 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.047308 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.047327 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.151247 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.151431 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.151460 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.151491 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.151512 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.254807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.254891 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.254913 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.254943 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.254970 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.359003 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.359111 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.359130 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.359156 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.359175 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.462054 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.462691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.462729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.462759 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.462777 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.530411 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.530478 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:18 crc kubenswrapper[4952]: E1122 02:55:18.530657 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.530731 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:18 crc kubenswrapper[4952]: E1122 02:55:18.530868 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:18 crc kubenswrapper[4952]: E1122 02:55:18.531096 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.565676 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.565771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.565797 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.565827 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.565848 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.669135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.669209 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.669230 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.669255 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.669273 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.771975 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.772094 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.772113 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.772144 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.772164 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.875619 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.875706 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.875728 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.875756 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.875777 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.979346 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.979492 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.979692 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.979800 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:18 crc kubenswrapper[4952]: I1122 02:55:18.979821 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:18Z","lastTransitionTime":"2025-11-22T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.083928 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.083997 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.084014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.084043 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.084060 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.187760 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.187838 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.187858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.187887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.187909 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.291680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.291757 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.291778 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.291807 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.291827 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.394618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.394737 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.394761 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.394826 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.394844 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.499385 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.499454 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.499474 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.499501 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.499519 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.531087 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:19 crc kubenswrapper[4952]: E1122 02:55:19.531345 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.602680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.602748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.602764 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.602796 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.602810 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.706512 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.706612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.706626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.706651 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.706667 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.810566 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.810630 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.810648 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.810674 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.810692 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.914805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.914880 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.914904 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.914938 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:19 crc kubenswrapper[4952]: I1122 02:55:19.914966 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:19Z","lastTransitionTime":"2025-11-22T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.018874 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.018974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.018994 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.019019 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.019035 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.122939 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.123010 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.123030 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.123056 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.123074 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.226896 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.226956 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.226978 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.227008 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.227028 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330169 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330258 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330271 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330290 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330303 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330398 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330671 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330684 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.330658248 +0000 UTC m=+148.636675521 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330739 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330823 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330845 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330858 4952 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330896 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.330887935 +0000 UTC m=+148.636905208 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330906 4952 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.330981 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.330959556 +0000 UTC m=+148.636976869 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.330821 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.331078 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331202 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331255 4952 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331281 4952 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331421 4952 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331486 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.331387327 +0000 UTC m=+148.637404640 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.331651 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.331590642 +0000 UTC m=+148.637607925 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.432675 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.432759 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.432776 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.432794 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.432806 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.531092 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.531114 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.531335 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
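The "durationBeforeRetry 1m4s" in the volume errors above is the signature of exponential backoff: 1m4s is 500ms doubled seven times, i.e. the eighth consecutive failure of the same mount or unmount operation. A minimal sketch of that progression; the 500ms base and the cap are assumptions inferred from the observed value, not constants copied from kubelet's nestedpendingoperations code:

```go
// Sketch of the doubling retry delay implied by "durationBeforeRetry 1m4s".
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond           // assumed initial delay
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap
	for failures := 1; delay <= maxDelay; failures++ {
		fmt.Printf("after failure %2d: wait %v\n", failures, delay)
		delay *= 2
	}
	// after failure 8: wait 1m4s, matching the log entries above.
}
```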
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.531603 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.532830 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:20 crc kubenswrapper[4952]: E1122 02:55:20.533028 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.537222 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.537276 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.537294 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.537321 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.537398 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.641023 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.641102 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.641129 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.641164 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.641193 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.744283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.744340 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.744357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.744379 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.744397 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.853506 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.853632 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.853658 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.853692 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.853716 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.958135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.958192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.958211 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.958234 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:20 crc kubenswrapper[4952]: I1122 02:55:20.958251 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:20Z","lastTransitionTime":"2025-11-22T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.062075 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.062128 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.062137 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.062154 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.062164 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.165249 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.165304 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.165318 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.165337 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.165349 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.268715 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.268803 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.268829 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.268863 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.268888 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.371747 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.371856 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.371869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.371888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.371900 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.474328 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.474382 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.474405 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.474425 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.474437 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.530750 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:21 crc kubenswrapper[4952]: E1122 02:55:21.531064 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.579745 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.579819 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.579839 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.579912 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.579935 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.683685 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.683754 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.683777 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.683805 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.683825 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.787245 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.787311 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.787326 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.787347 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.787363 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.890757 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.890884 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.890904 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.890931 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.890951 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.995921 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.996008 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.996061 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.996088 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:21 crc kubenswrapper[4952]: I1122 02:55:21.996106 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:21Z","lastTransitionTime":"2025-11-22T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.099771 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.099819 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.099837 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.099902 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.099924 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.203141 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.203198 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.203217 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.203244 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.203262 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.306232 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.306295 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.306315 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.306340 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.306356 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.409835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.409894 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.409909 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.409926 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.409941 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.512758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.513174 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.513183 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.513199 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.513209 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.530297 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.530359 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.530290 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:22 crc kubenswrapper[4952]: E1122 02:55:22.530443 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:22 crc kubenswrapper[4952]: E1122 02:55:22.530621 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:22 crc kubenswrapper[4952]: E1122 02:55:22.530709 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.616683 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.616748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.616767 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.616800 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.616819 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.719986 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.720044 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.720056 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.720077 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.720092 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.822793 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.822859 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.822870 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.822889 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.822905 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.925776 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.925855 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.925873 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.925900 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:22 crc kubenswrapper[4952]: I1122 02:55:22.925918 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:22Z","lastTransitionTime":"2025-11-22T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.028743 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.028803 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.028821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.028850 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.028874 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.132887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.132949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.132959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.132976 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.132987 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.237055 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.237146 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.237164 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.237211 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.237226 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.341149 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.341209 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.341224 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.341254 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.341274 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.444303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.444357 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.444372 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.444393 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.444409 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.530463 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:23 crc kubenswrapper[4952]: E1122 02:55:23.530687 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.532417 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 02:55:23 crc kubenswrapper[4952]: E1122 02:55:23.532780 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.547222 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.547293 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.547313 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.547341 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.547364 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.651189 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.651257 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.651268 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.651289 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.651304 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.755010 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.755080 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.755098 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.755120 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.755134 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.859304 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.859361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.859375 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.859396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.859415 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.962899 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.962984 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.962999 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.963027 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:23 crc kubenswrapper[4952]: I1122 02:55:23.963046 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:23Z","lastTransitionTime":"2025-11-22T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.065358 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.065409 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.065422 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.065438 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.065448 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.169130 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.169182 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.169192 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.169209 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.169220 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.271359 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.271428 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.271450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.271475 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.271495 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.374975 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.375035 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.375053 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.375078 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.375097 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.478335 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.478419 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.478430 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.478449 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.478461 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.530840 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.530928 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.530969 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.531096 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.531214 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.531349 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.555835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.555924 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.555949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.555983 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.556007 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.572093 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.576282 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.576355 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
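[Annotation] The entry above is the kubelet's strategic-merge patch of the node's .status (conditions, allocatable/capacity, and the cached image list) being rejected wholesale: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a certificate whose NotAfter, 2025-08-24T17:21:41Z, is months before the current time. A sketch for reading an endpoint's validity window directly, assuming the third-party `cryptography` package (>= 42 for the `*_utc` properties) is available; verification is deliberately disabled so the expired certificate can still be fetched.

```python
# Sketch: fetch the leaf certificate from a TLS endpoint and print its
# validity window. CERT_NONE is intentional here, since verifying an
# expired certificate would abort the handshake before we can inspect it.
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509  # third-party; assumed installed (>= 42)

HOST, PORT = "127.0.0.1", 9743  # the webhook endpoint named in the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER bytes of the leaf

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("not_before:", cert.not_valid_before_utc)
print("not_after: ", cert.not_valid_after_utc)
print("expired:   ", now > cert.not_valid_after_utc)
```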
event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.576371 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.576391 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.576409 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.591415 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.596240 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.596341 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.596361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.596387 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.596408 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.610286 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.618836 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.618898 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
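[Annotation] Each retry fails identically, and between retries the sync loop keeps re-recording the same NotReady condition, so the log repeats until either a CNI configuration appears or the webhook certificate is rotated. Confirming the CNI half on the host is just a directory listing; the helper below is hypothetical, using the exact path from the log and the conventional config extensions.

```python
# Sketch: check for CNI network configs in the directory the kubelet
# complains about. Extensions are the conventional ones (.conf, .conflist,
# .json); the network plugin writes its config here once it comes up.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

def cni_configs() -> list[Path]:
    if not CNI_CONF_DIR.is_dir():
        return []
    return sorted(p for p in CNI_CONF_DIR.iterdir()
                  if p.suffix in {".conf", ".conflist", ".json"})

configs = cni_configs()
print(f"{len(configs)} CNI config(s) under {CNI_CONF_DIR}")
for p in configs:
    print(" ", p)
```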
event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.618924 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.618951 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.618968 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.638612 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.643481 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.643577 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.643598 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.643680 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.643704 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.662786 4952 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"77d94e3a-20e5-45ab-9435-01440651fcdb\\\",\\\"systemUUID\\\":\\\"2d0f1a1c-2ee1-4b37-849e-8151c669da05\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:24 crc kubenswrapper[4952]: E1122 02:55:24.663066 4952 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.665825 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.665869 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.665887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.665909 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.665926 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.769482 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.769583 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.769605 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.769629 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.769645 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.873013 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.873143 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.873173 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.873197 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.873215 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.976178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.976233 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.976245 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.976268 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:24 crc kubenswrapper[4952]: I1122 02:55:24.976285 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:24Z","lastTransitionTime":"2025-11-22T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.079235 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.079320 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.079340 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.079364 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.079379 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.182450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.182499 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.182515 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.182539 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.182574 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.285664 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.285741 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.285758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.285789 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.285810 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.388086 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.388124 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.388133 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.388148 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.388158 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.491610 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.491676 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.491695 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.491724 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.491744 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.530795 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:25 crc kubenswrapper[4952]: E1122 02:55:25.531038 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.594937 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.595032 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.595055 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.595091 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.595119 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.699361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.699515 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.699534 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.699609 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.699636 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.802338 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.802398 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.802412 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.802442 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.802457 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.906262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.906328 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.906345 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.906370 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:25 crc kubenswrapper[4952]: I1122 02:55:25.906388 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:25Z","lastTransitionTime":"2025-11-22T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.009770 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.009821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.009833 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.009853 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.009869 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.114112 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.114215 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.114227 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.114248 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.114264 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.217037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.217198 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.217212 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.217256 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.217271 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.320015 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.320055 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.320066 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.320084 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.320096 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.423367 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.423421 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.423438 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.423459 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.423473 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.527497 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.527641 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.527669 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.527702 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.527726 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.530961 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.531034 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.531050 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:26 crc kubenswrapper[4952]: E1122 02:55:26.531209 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:26 crc kubenswrapper[4952]: E1122 02:55:26.531324 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:26 crc kubenswrapper[4952]: E1122 02:55:26.531635 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.553734 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://707eefde9801af59ee1d20cbe88c4d6f9976b3e13be46d7fdec1cb367ba4a395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a412934c4d0a59011676c5f489e9c2cc8e8c0e8ba7c6d76f170b836e5ba2a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\
\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.570248 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f311d8-e9ac-4dd7-bc2c-321490681934\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf79eacb5951b199a7bd5a0c9689a4c5e85060110330da1d8ba29a7c192b7779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmt28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn2dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.586174 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f1869ba-6fff-4b0d-9e45-1e2aac293caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eabf937d41275fd8b68da6fbe05fe8fc415fe89e3fed41cee305a00750bd4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b96da655c6ec124324a850442d9bd6deffb8e111f4882435404258f0e1351f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh6x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jkv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.619208 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c9d3264-a70e-4370-bc13-faad83c3b0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://772557a0662d51ad3e25fe23ca06fa285e8fae9947b27d0aea9ff31c774f5382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03caa8654357147237bcc4ecd9b1ea58f455c1fc769989874f890aa271e566e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://183d262ec0f526bf3bdfa529f338eee26ad88b6e4e697cf2b05ffe991767b72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d803c1410d7f6948e260107fe2b7af193d96e470c8a7552c43f18ab24d00c3d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645b140513def5097100f1d87f4e0b8c76426927c1ab36a990a200d2e48a944c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652b97be0740d78a6a7e2b3f59992e4a1232ee43ae66229adb239c1a93a9300c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94a30308df1fa84829796587ba632b0de64228a72b236d70d241f1e537950ad4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a0ba0cad810b61a37f1105cc9a5f0cc9e6a300fc2c709f098f068b6c8c257eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.632017 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.632074 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.632084 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.632101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 
02:55:26.632112 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.642494 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b274f771d95b55e2d51a5e3a6e7fefdb8dfcb4ddca203af686bc439dc9861dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.662363 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.676972 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.697415 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fa3b16-e7cb-4515-8f5b-b749d413d1d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c520fdcda0655480d9f13999eb9eb646001f70adfb319eeceb231d0f25657894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16f14ccc3d087e429329cf6baac0b426a4a09fb47e95ee485151e8cfccd8b544\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c98d99408f3c360024f9969bfaedd6021a38686094ee1131a280fbfcb2b28e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83d9cca26b3003c2c5dfada813d6ff241396b642ededd3cb1ec2fed20a4b62bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9fb316dc12decc5509c05c7c6d3a7a4544a7bb576eab460f32c66daa8d9ec8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"message\\\":\\\"file observer\\\\nW1122 02:54:16.476130 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 02:54:16.476305 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:54:16.477918 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-217268551/tls.crt::/tmp/serving-cert-217268551/tls.key\\\\\\\"\\\\nI1122 02:54:16.975054 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 02:54:16.981218 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 02:54:16.981243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 02:54:16.981267 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 02:54:16.981272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 02:54:16.991103 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 02:54:16.991140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991145 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 02:54:16.991149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 02:54:16.991153 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 02:54:16.991155 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 02:54:16.991159 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1122 02:54:16.991296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1122 02:54:17.006503 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59dd100539a8fed5dd0eefe587f3d299299e0b8c6c96c4f752b7d7dd397f4d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff4287c56c2939d6896a88ccf71f6243c3bb54aa6eecec30a218b069562559\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.710681 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095b61fc986aab4d0691312d402e549e5d4988e04d00ce20898cc01202ae5f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.735257 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.735306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.735347 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.735368 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.735385 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.738237 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:11Z\\\",\\\"message\\\":\\\"to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:11Z is after 2025-08-24T17:21:41Z]\\\\nI1122 02:55:11.514755 7010 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"d7d7b270-1480-47f8-bdf9-690dbab310cb\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:55:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdmhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qnw6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.755814 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e94605-ee67-4d5b-8396-fbe7f8a1a6e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4581a1e39c030fb21b7c528ecba9701f4752146ff78a89af0f046bc8e04937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e95da560e78f4a7d306bc4bf8b33915a26b0db13e8c0c992728c7cb7fb68d80e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150b8ee37bd571d6cfd91f4a717b730e76994608a235ee5f3c45d5c961d9b579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eba7529c67e81fc405ee89376cc013a8057e45c67f80af6917368b3546f9f8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb349145c9eda95da7153846e6b00d8365e21c4663017cdee9b8179d383488b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0efdbea1fc77ff6e4f6d77a66fa0ea57e00a62a296f36f40ba96b180d5756623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f75e4226d55f052c559b2ad35d4085099dd7d73fe9ab0f7bb7a75b4e6b1aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:54:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj94l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ts9bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.771869 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc93cee4-8fff-4eac-b7ee-9a0e550c7d62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bbb9717ab2c2be566e304ddda4cb8e43d9010f4fa4a663bbf5734fb36399f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a681e7f405c868e2932037f2542ba2aa2666f8ff23e776d0c952974398d282fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66a6011947698e74ae244926c4fc492bde121b2f435911005424b9280325361a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b9a54b19cb808c8b10308e8656027ffb61703bf8d6454241138cfbbf4c17d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.785145 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4db386b4-babe-4dc8-bcdc-02763c1c602b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83b4302681a039e3cb3783b65bcda4bf5e2e0a03f656c79b871dd22105e1bf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af20c0e9e54a71a3edb2da902a21eac2c66032a80ac644bdb2aa89e99af10630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.800020 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j9kg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccedfe81-43b3-4af7-88c7-9953b33e7d13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:55:05Z\\\",\\\"message\\\":\\\"2025-11-22T02:54:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5\\\\n2025-11-22T02:54:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a97ad85-d209-4196-9a77-c1f23a382ae5 to /host/opt/cni/bin/\\\\n2025-11-22T02:54:20Z [verbose] multus-daemon started\\\\n2025-11-22T02:54:20Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:55:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xc9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j9kg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.813371 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x6nk8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"299b06f8-5ba8-425d-96a5-2866e435b986\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6660ee2233282ef9b421e300a2708ebe56572f8e8f1788512b7e984384b7660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf8ws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x6nk8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.825825 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7wlpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9505980-28b9-46e1-85b2-ade5d1684ee7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231d1d4d1611eafa2a1f7324cb9c2dfc73ec3a42f3688e9a09b92820aca7dc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r9vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7wlpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.838583 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.838659 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.838672 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.838693 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.838724 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.840919 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gkngm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zt95v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:54:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gkngm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.854734 4952 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f84170c3-7fcb-489a-a0e3-ab3347683d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f31d8fd3d92c2ab76bf0d8d40713e11ac7d201c45f8e654efdb9d9bd506293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d3b91dd02108a2c438a8d217925002a1216ae0316c610ebb874cbc5e2d82396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adb4d7d41a42f13a7aa2b6a9d89c5e320027ddb852afd1fc96d5a9845d4c4e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://832e75b37c95a7fb9a6138626c24aa2344
cbb4cfc1919e87f53b6a438a2b6a5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.866733 4952 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:55:26Z is after 2025-08-24T17:21:41Z" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.941326 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.941378 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.941389 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.941409 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:26 crc kubenswrapper[4952]: I1122 02:55:26.941426 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:26Z","lastTransitionTime":"2025-11-22T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.043901 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.043949 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.043962 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.043981 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.043993 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.147432 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.147484 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.147495 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.147513 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.147529 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.250348 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.250401 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.250412 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.250432 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.250443 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.359613 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.359703 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.359726 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.359758 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.359782 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.463470 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.463596 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.463613 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.463633 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.463646 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.530457 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:27 crc kubenswrapper[4952]: E1122 02:55:27.530663 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.567461 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.567500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.567511 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.567532 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.567620 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.670861 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.670918 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.670934 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.670958 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.670974 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.774303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.774372 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.774389 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.774419 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.774436 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.878107 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.878163 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.878176 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.878198 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.878210 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.980870 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.980924 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.980936 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.980957 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:27 crc kubenswrapper[4952]: I1122 02:55:27.980973 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:27Z","lastTransitionTime":"2025-11-22T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.084437 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.084484 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.084493 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.084510 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.084521 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.187397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.187447 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.187457 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.187476 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.187493 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.291349 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.291414 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.291426 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.291450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.291464 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.394835 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.394888 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.394901 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.394926 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.394941 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.498164 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.498239 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.498260 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.498286 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.498347 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.531261 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.531349 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:28 crc kubenswrapper[4952]: E1122 02:55:28.531626 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.531711 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:28 crc kubenswrapper[4952]: E1122 02:55:28.531818 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:28 crc kubenswrapper[4952]: E1122 02:55:28.531931 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.601516 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.601598 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.601612 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.601631 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.601642 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.705394 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.705450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.705461 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.705484 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.705497 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.808691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.808778 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.808793 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.808812 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.808826 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.912017 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.912089 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.912112 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.912142 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:28 crc kubenswrapper[4952]: I1122 02:55:28.912161 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:28Z","lastTransitionTime":"2025-11-22T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.016573 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.016643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.016658 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.016681 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.016695 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.119523 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.119621 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.119643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.119673 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.119697 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.222398 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.222452 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.222465 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.222488 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.222502 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.325505 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.325663 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.325684 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.325717 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.325746 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.429410 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.429485 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.429511 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.429586 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.429615 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.530875 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:29 crc kubenswrapper[4952]: E1122 02:55:29.531170 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.533364 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.533416 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.533433 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.533456 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.533477 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.637207 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.637262 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.637280 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.637307 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.637322 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.740450 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.740516 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.740531 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.740585 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.740601 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.843860 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.843910 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.843919 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.843941 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.843953 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.947686 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.947738 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.947750 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.947778 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:29 crc kubenswrapper[4952]: I1122 02:55:29.947802 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:29Z","lastTransitionTime":"2025-11-22T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.051026 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.051101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.051159 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.051188 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.051208 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.154206 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.154291 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.154315 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.154343 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.154363 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.257456 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.257532 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.257588 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.257618 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.257639 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.360991 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.361060 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.361077 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.361103 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.361123 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.464329 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.464397 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.464416 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.464444 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.464462 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.530457 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.530470 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:30 crc kubenswrapper[4952]: E1122 02:55:30.530759 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.530815 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:30 crc kubenswrapper[4952]: E1122 02:55:30.530983 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:30 crc kubenswrapper[4952]: E1122 02:55:30.531141 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.567717 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.567779 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.567798 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.567821 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.567838 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.671528 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.671635 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.671659 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.671691 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.671715 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.775303 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.775365 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.775375 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.775395 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.775406 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.879112 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.879163 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.879173 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.879194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.879206 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.982221 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.982313 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.982332 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.982361 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:30 crc kubenswrapper[4952]: I1122 02:55:30.982381 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:30Z","lastTransitionTime":"2025-11-22T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.086014 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.086077 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.086095 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.086121 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.086138 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.188863 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.188917 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.188929 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.188948 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.188963 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.292803 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.292866 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.292889 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.292921 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.292942 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.396083 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.396135 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.396156 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.396194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.396232 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.500097 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.500178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.500204 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.500242 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.500279 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.531064 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:31 crc kubenswrapper[4952]: E1122 02:55:31.531337 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.605509 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.605581 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.605594 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.605627 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.605651 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.710127 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.710212 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.710238 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.710270 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.710295 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.813638 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.813702 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.813723 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.813748 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.813768 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.916509 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.916576 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.916589 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.916605 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:31 crc kubenswrapper[4952]: I1122 02:55:31.916616 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:31Z","lastTransitionTime":"2025-11-22T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.019800 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.019873 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.019893 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.019920 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.019942 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.123151 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.123229 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.123253 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.123288 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.123311 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.226839 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.226921 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.226940 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.226967 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.226986 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.330972 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.331046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.331065 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.331092 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.331115 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.435471 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.435577 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.435601 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.435630 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.435650 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.530694 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.530745 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.530768 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:32 crc kubenswrapper[4952]: E1122 02:55:32.530911 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:32 crc kubenswrapper[4952]: E1122 02:55:32.531061 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:32 crc kubenswrapper[4952]: E1122 02:55:32.531331 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.538830 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.538858 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.538870 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.538890 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.538906 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.642200 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.642247 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.642259 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.642280 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.642292 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.745500 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.745614 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.745634 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.745659 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.745679 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.848908 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.848974 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.848994 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.849025 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.849044 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.952300 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.952399 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.952422 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.952489 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:32 crc kubenswrapper[4952]: I1122 02:55:32.952511 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:32Z","lastTransitionTime":"2025-11-22T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.055713 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.055781 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.055795 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.055815 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.055829 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.159590 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.159643 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.159729 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.159792 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.159807 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.263405 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.264037 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.264144 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.264245 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.264337 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.366927 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.366970 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.366980 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.366999 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.367011 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.470194 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.470247 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.470266 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.470291 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.470308 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.530505 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:33 crc kubenswrapper[4952]: E1122 02:55:33.530726 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.574046 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.574096 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.574115 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.574136 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.574149 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.677187 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.677241 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.677255 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.677274 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.677284 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.781222 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.781271 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.781293 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.781317 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.781328 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.884852 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.884896 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.884905 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.884927 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.884939 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.987887 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.987950 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.987964 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.987983 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:33 crc kubenswrapper[4952]: I1122 02:55:33.987996 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:33Z","lastTransitionTime":"2025-11-22T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.091570 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.091626 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.091638 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.091658 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.091671 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.194209 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.194272 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.194283 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.194306 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.194317 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.297101 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.297147 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.297161 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.297178 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.297189 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.400334 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.400396 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.400503 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.400529 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.400574 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.502741 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.502790 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.502803 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.502820 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.502832 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.530456 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.530585 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.530654 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:34 crc kubenswrapper[4952]: E1122 02:55:34.530806 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:34 crc kubenswrapper[4952]: E1122 02:55:34.530933 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:34 crc kubenswrapper[4952]: E1122 02:55:34.531051 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.605915 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.605959 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.605971 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.605990 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.606003 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.709156 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.709202 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.709215 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.709234 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.709245 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.811859 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.811907 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.811919 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.811938 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.812022 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.914577 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.914644 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.914669 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.914693 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:34 crc kubenswrapper[4952]: I1122 02:55:34.914711 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:34Z","lastTransitionTime":"2025-11-22T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.017786 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.017825 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.017834 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.017850 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.017860 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:35Z","lastTransitionTime":"2025-11-22T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.059934 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.059984 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.060006 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.060024 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.060035 4952 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:55:35Z","lastTransitionTime":"2025-11-22T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.110431 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n"] Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.110974 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.112863 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.113505 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.113508 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.116133 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.128626 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x6nk8" podStartSLOduration=78.128606313 podStartE2EDuration="1m18.128606313s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.128573862 +0000 UTC m=+99.434591145" watchObservedRunningTime="2025-11-22 02:55:35.128606313 +0000 UTC m=+99.434623586" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.140782 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7wlpk" podStartSLOduration=78.140762146 podStartE2EDuration="1m18.140762146s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.140389176 +0000 UTC m=+99.446406449" watchObservedRunningTime="2025-11-22 02:55:35.140762146 +0000 UTC m=+99.446779419" Nov 22 02:55:35 crc kubenswrapper[4952]: 
I1122 02:55:35.183571 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.183519705 podStartE2EDuration="1m18.183519705s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.167695068 +0000 UTC m=+99.473712361" watchObservedRunningTime="2025-11-22 02:55:35.183519705 +0000 UTC m=+99.489536978" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.198101 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j9kg2" podStartSLOduration=78.19807934 podStartE2EDuration="1m18.19807934s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.197487964 +0000 UTC m=+99.503505267" watchObservedRunningTime="2025-11-22 02:55:35.19807934 +0000 UTC m=+99.504096613" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.211761 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podStartSLOduration=78.211733131 podStartE2EDuration="1m18.211733131s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.211507615 +0000 UTC m=+99.517524928" watchObservedRunningTime="2025-11-22 02:55:35.211733131 +0000 UTC m=+99.517750404" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.214306 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0486de20-fbfe-4eda-a207-5ab4e28937e2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.214533 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0486de20-fbfe-4eda-a207-5ab4e28937e2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.214602 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.214620 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 
02:55:35.214659 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0486de20-fbfe-4eda-a207-5ab4e28937e2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.296374 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.296340447 podStartE2EDuration="1m20.296340447s" podCreationTimestamp="2025-11-22 02:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.29453373 +0000 UTC m=+99.600551003" watchObservedRunningTime="2025-11-22 02:55:35.296340447 +0000 UTC m=+99.602357730" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.296884 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jkv7n" podStartSLOduration=78.29687456 podStartE2EDuration="1m18.29687456s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.228699767 +0000 UTC m=+99.534717070" watchObservedRunningTime="2025-11-22 02:55:35.29687456 +0000 UTC m=+99.602891833" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316311 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0486de20-fbfe-4eda-a207-5ab4e28937e2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316432 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0486de20-fbfe-4eda-a207-5ab4e28937e2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316473 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316499 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316530 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0486de20-fbfe-4eda-a207-5ab4e28937e2-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316599 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.316615 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0486de20-fbfe-4eda-a207-5ab4e28937e2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.317202 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0486de20-fbfe-4eda-a207-5ab4e28937e2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.323407 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0486de20-fbfe-4eda-a207-5ab4e28937e2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.333445 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0486de20-fbfe-4eda-a207-5ab4e28937e2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-stq8n\" (UID: \"0486de20-fbfe-4eda-a207-5ab4e28937e2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.425005 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.442828 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ts9bc" podStartSLOduration=78.442799084 podStartE2EDuration="1m18.442799084s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.442224498 +0000 UTC m=+99.748241781" watchObservedRunningTime="2025-11-22 02:55:35.442799084 +0000 UTC m=+99.748816357" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.491782 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.491753373 podStartE2EDuration="31.491753373s" podCreationTimestamp="2025-11-22 02:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.491398193 +0000 UTC m=+99.797415486" watchObservedRunningTime="2025-11-22 02:55:35.491753373 +0000 UTC m=+99.797770656" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.492321 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.492313827 podStartE2EDuration="44.492313827s" podCreationTimestamp="2025-11-22 02:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.478397029 +0000 UTC m=+99.784414312" watchObservedRunningTime="2025-11-22 02:55:35.492313827 +0000 UTC m=+99.798331110" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.510914 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.510887544 podStartE2EDuration="1m18.510887544s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:35.50957414 +0000 UTC m=+99.815591413" watchObservedRunningTime="2025-11-22 02:55:35.510887544 +0000 UTC m=+99.816904817" Nov 22 02:55:35 crc kubenswrapper[4952]: I1122 02:55:35.530870 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:35 crc kubenswrapper[4952]: E1122 02:55:35.531039 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.197945 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" event={"ID":"0486de20-fbfe-4eda-a207-5ab4e28937e2","Type":"ContainerStarted","Data":"c41754eb4801abbdc04ac23f2b7b85c998c28e2107b0700489d57b431f789f60"} Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.198399 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" event={"ID":"0486de20-fbfe-4eda-a207-5ab4e28937e2","Type":"ContainerStarted","Data":"eb0ad9ae722b9c2d5208993f1428ddaef978adba74c40a6ed0de9b9a9458a988"} Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.215866 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stq8n" podStartSLOduration=79.215839695 podStartE2EDuration="1m19.215839695s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:36.212400066 +0000 UTC m=+100.518417379" watchObservedRunningTime="2025-11-22 02:55:36.215839695 +0000 UTC m=+100.521856978" Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.530467 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.530639 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.530601 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:36 crc kubenswrapper[4952]: E1122 02:55:36.533285 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:36 crc kubenswrapper[4952]: E1122 02:55:36.533706 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:36 crc kubenswrapper[4952]: E1122 02:55:36.534097 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:36 crc kubenswrapper[4952]: I1122 02:55:36.837057 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:36 crc kubenswrapper[4952]: E1122 02:55:36.837238 4952 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:55:36 crc kubenswrapper[4952]: E1122 02:55:36.837312 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs podName:c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc nodeName:}" failed. No retries permitted until 2025-11-22 02:56:40.837290897 +0000 UTC m=+165.143308170 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs") pod "network-metrics-daemon-gkngm" (UID: "c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:55:37 crc kubenswrapper[4952]: I1122 02:55:37.530204 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:37 crc kubenswrapper[4952]: E1122 02:55:37.531170 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:37 crc kubenswrapper[4952]: I1122 02:55:37.531755 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 02:55:37 crc kubenswrapper[4952]: E1122 02:55:37.531903 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qnw6b_openshift-ovn-kubernetes(bef051cd-2285-4b6b-a16f-1154f4d1f5dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" Nov 22 02:55:38 crc kubenswrapper[4952]: I1122 02:55:38.530428 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:38 crc kubenswrapper[4952]: I1122 02:55:38.530493 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:38 crc kubenswrapper[4952]: I1122 02:55:38.530594 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:38 crc kubenswrapper[4952]: E1122 02:55:38.530708 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:38 crc kubenswrapper[4952]: E1122 02:55:38.530778 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:38 crc kubenswrapper[4952]: E1122 02:55:38.530913 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:39 crc kubenswrapper[4952]: I1122 02:55:39.530741 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:39 crc kubenswrapper[4952]: E1122 02:55:39.531626 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:40 crc kubenswrapper[4952]: I1122 02:55:40.530629 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:40 crc kubenswrapper[4952]: I1122 02:55:40.530674 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:40 crc kubenswrapper[4952]: E1122 02:55:40.530835 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:40 crc kubenswrapper[4952]: I1122 02:55:40.530884 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:40 crc kubenswrapper[4952]: E1122 02:55:40.531075 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:40 crc kubenswrapper[4952]: E1122 02:55:40.531123 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:41 crc kubenswrapper[4952]: I1122 02:55:41.585709 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:41 crc kubenswrapper[4952]: I1122 02:55:41.585922 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:41 crc kubenswrapper[4952]: E1122 02:55:41.586057 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:41 crc kubenswrapper[4952]: E1122 02:55:41.586697 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:42 crc kubenswrapper[4952]: I1122 02:55:42.531141 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:42 crc kubenswrapper[4952]: I1122 02:55:42.531149 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:42 crc kubenswrapper[4952]: E1122 02:55:42.531293 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:42 crc kubenswrapper[4952]: E1122 02:55:42.531468 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:43 crc kubenswrapper[4952]: I1122 02:55:43.530622 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:43 crc kubenswrapper[4952]: I1122 02:55:43.530668 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:43 crc kubenswrapper[4952]: E1122 02:55:43.530867 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:43 crc kubenswrapper[4952]: E1122 02:55:43.531046 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:44 crc kubenswrapper[4952]: I1122 02:55:44.531084 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:44 crc kubenswrapper[4952]: I1122 02:55:44.531084 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:44 crc kubenswrapper[4952]: E1122 02:55:44.531499 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:44 crc kubenswrapper[4952]: E1122 02:55:44.531662 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:45 crc kubenswrapper[4952]: I1122 02:55:45.530095 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:45 crc kubenswrapper[4952]: I1122 02:55:45.530095 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:45 crc kubenswrapper[4952]: E1122 02:55:45.530449 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:45 crc kubenswrapper[4952]: E1122 02:55:45.530291 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:46 crc kubenswrapper[4952]: I1122 02:55:46.530163 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:46 crc kubenswrapper[4952]: I1122 02:55:46.530213 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:46 crc kubenswrapper[4952]: E1122 02:55:46.532511 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:46 crc kubenswrapper[4952]: E1122 02:55:46.532732 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:47 crc kubenswrapper[4952]: I1122 02:55:47.530128 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:47 crc kubenswrapper[4952]: I1122 02:55:47.530178 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:47 crc kubenswrapper[4952]: E1122 02:55:47.530303 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:47 crc kubenswrapper[4952]: E1122 02:55:47.530702 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:48 crc kubenswrapper[4952]: I1122 02:55:48.530887 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:48 crc kubenswrapper[4952]: I1122 02:55:48.531002 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:48 crc kubenswrapper[4952]: E1122 02:55:48.531152 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:48 crc kubenswrapper[4952]: E1122 02:55:48.531348 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:49 crc kubenswrapper[4952]: I1122 02:55:49.531165 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:49 crc kubenswrapper[4952]: I1122 02:55:49.531302 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:49 crc kubenswrapper[4952]: E1122 02:55:49.531408 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:49 crc kubenswrapper[4952]: E1122 02:55:49.531595 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:50 crc kubenswrapper[4952]: I1122 02:55:50.530292 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:50 crc kubenswrapper[4952]: I1122 02:55:50.530293 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:50 crc kubenswrapper[4952]: E1122 02:55:50.530426 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:50 crc kubenswrapper[4952]: E1122 02:55:50.530994 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:51 crc kubenswrapper[4952]: I1122 02:55:51.530539 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:51 crc kubenswrapper[4952]: I1122 02:55:51.530673 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:51 crc kubenswrapper[4952]: E1122 02:55:51.532024 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:51 crc kubenswrapper[4952]: E1122 02:55:51.532084 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.266699 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/1.log" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.267748 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/0.log" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.267829 4952 generic.go:334] "Generic (PLEG): container finished" podID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" containerID="99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782" exitCode=1 Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.267898 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerDied","Data":"99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782"} Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.268003 4952 scope.go:117] "RemoveContainer" containerID="6c10cbb5de2dcab5d4375815ae638bc06510d9b1945a92ff6c800544bb763489" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.269129 4952 scope.go:117] "RemoveContainer" containerID="99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782" Nov 22 02:55:52 crc kubenswrapper[4952]: E1122 02:55:52.269502 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-j9kg2_openshift-multus(ccedfe81-43b3-4af7-88c7-9953b33e7d13)\"" pod="openshift-multus/multus-j9kg2" podUID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.530771 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.530835 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:52 crc kubenswrapper[4952]: E1122 02:55:52.531108 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:52 crc kubenswrapper[4952]: E1122 02:55:52.531863 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:52 crc kubenswrapper[4952]: I1122 02:55:52.532595 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.273974 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/3.log" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.278309 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerStarted","Data":"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"} Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.278983 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.280810 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/1.log" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.305968 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podStartSLOduration=96.305940311 podStartE2EDuration="1m36.305940311s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:53.305342785 +0000 UTC m=+117.611360068" watchObservedRunningTime="2025-11-22 02:55:53.305940311 +0000 UTC m=+117.611957584" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.515999 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gkngm"] Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.516148 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:53 crc kubenswrapper[4952]: E1122 02:55:53.516248 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:53 crc kubenswrapper[4952]: I1122 02:55:53.530092 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:53 crc kubenswrapper[4952]: E1122 02:55:53.530250 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:54 crc kubenswrapper[4952]: I1122 02:55:54.530815 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:54 crc kubenswrapper[4952]: E1122 02:55:54.531075 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:54 crc kubenswrapper[4952]: I1122 02:55:54.530837 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:54 crc kubenswrapper[4952]: E1122 02:55:54.531307 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:55 crc kubenswrapper[4952]: I1122 02:55:55.531424 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:55 crc kubenswrapper[4952]: I1122 02:55:55.531466 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:55 crc kubenswrapper[4952]: E1122 02:55:55.533419 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:55 crc kubenswrapper[4952]: E1122 02:55:55.533445 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:56 crc kubenswrapper[4952]: E1122 02:55:56.510053 4952 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 22 02:55:56 crc kubenswrapper[4952]: I1122 02:55:56.531062 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:56 crc kubenswrapper[4952]: I1122 02:55:56.531139 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:56 crc kubenswrapper[4952]: E1122 02:55:56.533158 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:56 crc kubenswrapper[4952]: E1122 02:55:56.533349 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:56 crc kubenswrapper[4952]: E1122 02:55:56.653206 4952 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 02:55:57 crc kubenswrapper[4952]: I1122 02:55:57.530970 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:57 crc kubenswrapper[4952]: I1122 02:55:57.531015 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:57 crc kubenswrapper[4952]: E1122 02:55:57.531659 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:57 crc kubenswrapper[4952]: E1122 02:55:57.531859 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:55:58 crc kubenswrapper[4952]: I1122 02:55:58.530954 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:58 crc kubenswrapper[4952]: I1122 02:55:58.531005 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:58 crc kubenswrapper[4952]: E1122 02:55:58.531166 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:58 crc kubenswrapper[4952]: E1122 02:55:58.531537 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:59 crc kubenswrapper[4952]: I1122 02:55:59.531143 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:59 crc kubenswrapper[4952]: I1122 02:55:59.531182 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:55:59 crc kubenswrapper[4952]: E1122 02:55:59.531346 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:59 crc kubenswrapper[4952]: E1122 02:55:59.531487 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:56:00 crc kubenswrapper[4952]: I1122 02:56:00.531215 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:00 crc kubenswrapper[4952]: E1122 02:56:00.531471 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:56:00 crc kubenswrapper[4952]: I1122 02:56:00.531631 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:00 crc kubenswrapper[4952]: E1122 02:56:00.531810 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:56:01 crc kubenswrapper[4952]: I1122 02:56:01.530354 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:01 crc kubenswrapper[4952]: I1122 02:56:01.530413 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:01 crc kubenswrapper[4952]: E1122 02:56:01.530664 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:56:01 crc kubenswrapper[4952]: E1122 02:56:01.530803 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:56:01 crc kubenswrapper[4952]: E1122 02:56:01.654973 4952 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 02:56:02 crc kubenswrapper[4952]: I1122 02:56:02.530761 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:02 crc kubenswrapper[4952]: I1122 02:56:02.530829 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:02 crc kubenswrapper[4952]: E1122 02:56:02.530990 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:56:02 crc kubenswrapper[4952]: E1122 02:56:02.531373 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:56:03 crc kubenswrapper[4952]: I1122 02:56:03.531068 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:03 crc kubenswrapper[4952]: I1122 02:56:03.531132 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:03 crc kubenswrapper[4952]: E1122 02:56:03.531713 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:56:03 crc kubenswrapper[4952]: E1122 02:56:03.532210 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:56:04 crc kubenswrapper[4952]: I1122 02:56:04.531271 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:04 crc kubenswrapper[4952]: I1122 02:56:04.531378 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:04 crc kubenswrapper[4952]: E1122 02:56:04.531524 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:56:04 crc kubenswrapper[4952]: I1122 02:56:04.531780 4952 scope.go:117] "RemoveContainer" containerID="99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782" Nov 22 02:56:04 crc kubenswrapper[4952]: E1122 02:56:04.532182 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:56:05 crc kubenswrapper[4952]: I1122 02:56:05.332655 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/1.log" Nov 22 02:56:05 crc kubenswrapper[4952]: I1122 02:56:05.332735 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerStarted","Data":"cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31"} Nov 22 02:56:05 crc kubenswrapper[4952]: I1122 02:56:05.530408 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:05 crc kubenswrapper[4952]: I1122 02:56:05.530526 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:05 crc kubenswrapper[4952]: E1122 02:56:05.530597 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:56:05 crc kubenswrapper[4952]: E1122 02:56:05.530751 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gkngm" podUID="c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc" Nov 22 02:56:06 crc kubenswrapper[4952]: I1122 02:56:06.531000 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:06 crc kubenswrapper[4952]: I1122 02:56:06.531045 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:06 crc kubenswrapper[4952]: E1122 02:56:06.532940 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:56:06 crc kubenswrapper[4952]: E1122 02:56:06.533059 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.530877 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.530962 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.534762 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.535097 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.535193 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 02:56:07 crc kubenswrapper[4952]: I1122 02:56:07.535722 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 02:56:08 crc kubenswrapper[4952]: I1122 02:56:08.530955 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:08 crc kubenswrapper[4952]: I1122 02:56:08.530968 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:08 crc kubenswrapper[4952]: I1122 02:56:08.535378 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 02:56:08 crc kubenswrapper[4952]: I1122 02:56:08.538432 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 02:56:12 crc kubenswrapper[4952]: I1122 02:56:12.846631 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.095794 4952 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.164393 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dfwdz"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.166901 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.167190 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178086 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178298 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178486 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178633 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178656 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178671 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178708 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.178891 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.182362 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.182756 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.191164 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.191882 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.192373 4952 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.192763 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.193449 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.193907 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.194010 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.195994 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.196335 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.196674 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.196844 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.197130 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.197252 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.197466 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.197589 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.199338 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.199679 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.200136 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.200467 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.201012 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.201115 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.201662 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.202638 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.202947 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.203767 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.208118 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.208898 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.209590 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.210999 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.211776 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.212091 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.212485 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221079 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221123 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221238 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221495 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221736 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221749 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.221927 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222068 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222203 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222272 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222303 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222319 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222404 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222491 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222511 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222576 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222648 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222677 4952 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222719 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222747 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222769 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222748 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222825 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222831 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222863 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222884 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.222974 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.223291 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.223424 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.223861 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224021 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224048 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-image-import-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224086 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-serving-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224125 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224144 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224247 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-node-pullsecrets\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224278 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-encryption-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224314 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit-dir\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224354 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mskvt\" (UniqueName: \"kubernetes.io/projected/b72acca1-8338-4e9c-9fa5-55616766e8a9-kube-api-access-mskvt\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224450 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224939 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-client\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.224997 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.225051 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-serving-cert\") pod 
\"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.227337 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.238369 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.239796 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.240825 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.246390 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.248879 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.249313 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.249683 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.249971 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.250083 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.250711 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251049 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251167 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251399 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251559 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251700 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251716 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251814 4952 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251845 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.251877 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.252484 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.252610 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.252791 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.252917 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.253348 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.255856 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.256400 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.258384 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.259119 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.261712 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.263532 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264425 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264509 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vg8j9"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264638 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264706 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264842 4952 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.264904 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.265015 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.266629 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.290018 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-kkdb8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.290493 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.290627 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.290661 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.291355 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.293123 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.295852 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.296041 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.296116 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.296428 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.296704 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.298210 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.298644 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.298711 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.299253 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 
02:56:16.299418 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.301862 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-89rxq"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.302733 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.303427 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.303721 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.308040 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.308737 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.309650 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.310010 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.312759 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.313304 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.314072 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.314301 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.316095 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.318171 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.319234 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.322773 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325218 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325760 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit-dir\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325797 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-serving-cert\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325817 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-dir\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325832 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-config\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325847 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7nm8\" (UniqueName: \"kubernetes.io/projected/af11e3ed-3c58-4ad5-9da7-38b9950ff726-kube-api-access-f7nm8\") pod \"downloads-7954f5f757-kkdb8\" (UID: \"af11e3ed-3c58-4ad5-9da7-38b9950ff726\") " pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325875 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mskvt\" (UniqueName: \"kubernetes.io/projected/b72acca1-8338-4e9c-9fa5-55616766e8a9-kube-api-access-mskvt\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325895 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325917 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-864dz\" (UniqueName: \"kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325934 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325950 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325965 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325982 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpd97\" (UniqueName: \"kubernetes.io/projected/8a0cf8dd-fc29-442b-9ff2-7360946df755-kube-api-access-fpd97\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.325997 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-serving-cert\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326013 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-encryption-config\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326032 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 
02:56:16.326082 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ade98cb-8582-4066-b635-e837a190302d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326108 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-serving-cert\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326124 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326140 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0cf8dd-fc29-442b-9ff2-7360946df755-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326156 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326172 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcf88\" (UniqueName: \"kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326188 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e043e178-e7e5-4ddf-b561-7253433d6e81-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326203 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-srv-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326218 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326233 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326256 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326276 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7qz\" (UniqueName: \"kubernetes.io/projected/a4782fdd-4348-4995-af8f-eb6d61183dec-kube-api-access-4w7qz\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326298 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-trusted-ca\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326320 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326339 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326356 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: 
\"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326394 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-serving-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326411 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ade98cb-8582-4066-b635-e837a190302d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326431 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxrrj\" (UniqueName: \"kubernetes.io/projected/ef466a9b-34cc-4282-ad35-96731b58b8c3-kube-api-access-bxrrj\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326448 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5xv\" (UniqueName: \"kubernetes.io/projected/534f7f1e-7321-49f1-8d68-7b356c28b8ac-kube-api-access-rr5xv\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326467 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8vq\" (UniqueName: \"kubernetes.io/projected/ce050654-a42c-4472-9990-581502ae1830-kube-api-access-rj8vq\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326487 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-policies\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326504 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrg44\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-kube-api-access-nrg44\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326535 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326572 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-images\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326589 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326606 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bl8d\" (UniqueName: \"kubernetes.io/projected/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-kube-api-access-6bl8d\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326621 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534f7f1e-7321-49f1-8d68-7b356c28b8ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326635 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznss\" (UniqueName: \"kubernetes.io/projected/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-kube-api-access-cznss\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326652 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4782fdd-4348-4995-af8f-eb6d61183dec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326668 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-client\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326684 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dfwdz\" 
(UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326702 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvvd\" (UniqueName: \"kubernetes.io/projected/7f436461-6865-411d-9c2d-8c5794d1b4ab-kube-api-access-gdvvd\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326717 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0cf8dd-fc29-442b-9ff2-7360946df755-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326733 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326747 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-client\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.326762 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.327093 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit-dir\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328173 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-serving-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328181 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328712 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328748 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328771 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328793 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6flq\" (UniqueName: \"kubernetes.io/projected/e043e178-e7e5-4ddf-b561-7253433d6e81-kube-api-access-t6flq\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328810 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328816 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ggzk"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328830 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328848 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328867 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328886 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/534f7f1e-7321-49f1-8d68-7b356c28b8ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328902 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-serving-cert\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328923 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328942 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328959 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-config\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328979 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.328994 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-config\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329014 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-image-import-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329034 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n9h6\" (UniqueName: 
\"kubernetes.io/projected/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-kube-api-access-5n9h6\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329052 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef466a9b-34cc-4282-ad35-96731b58b8c3-machine-approver-tls\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329073 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329095 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsj25\" (UniqueName: \"kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329124 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329155 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329173 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-auth-proxy-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329193 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-profile-collector-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329213 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-node-pullsecrets\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329232 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-encryption-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329277 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329417 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-audit\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.329624 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.330249 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-image-import-ca\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.330452 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b72acca1-8338-4e9c-9fa5-55616766e8a9-node-pullsecrets\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.330737 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.331285 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72acca1-8338-4e9c-9fa5-55616766e8a9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.331381 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65gk2"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.332498 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.333007 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.333426 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.335361 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.335784 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.336869 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9wz27"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.337086 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.337537 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.340484 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.340588 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-serving-cert\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.340888 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.345434 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.346466 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.346617 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.364089 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.364964 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-etcd-client\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.365188 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.367067 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72acca1-8338-4e9c-9fa5-55616766e8a9-encryption-config\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.368397 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t4rf8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.374158 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.376742 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.378296 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.378440 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.381252 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.381498 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.384054 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.384625 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.389386 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.389593 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.390364 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.391105 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.391882 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.397920 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.399118 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.400393 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fhpzk"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.402086 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fhpzk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.402944 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.408061 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.409146 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.410271 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.410846 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.414101 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dfwdz"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.415198 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-94k47"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.416371 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m2hvn"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.416508 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.417242 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.417361 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m2hvn" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.419524 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.420580 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.421879 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.423027 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.424089 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vg8j9"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.425445 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.426530 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.427845 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.429756 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kkdb8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430134 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e043e178-e7e5-4ddf-b561-7253433d6e81-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430165 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-srv-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430188 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-trusted-ca\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430205 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430778 
4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430809 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7qz\" (UniqueName: \"kubernetes.io/projected/a4782fdd-4348-4995-af8f-eb6d61183dec-kube-api-access-4w7qz\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430830 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430846 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430864 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ade98cb-8582-4066-b635-e837a190302d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430883 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrg44\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-kube-api-access-nrg44\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430905 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430922 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430942 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.430962 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431035 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-images\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431057 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bl8d\" (UniqueName: \"kubernetes.io/projected/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-kube-api-access-6bl8d\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431081 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534f7f1e-7321-49f1-8d68-7b356c28b8ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431100 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4782fdd-4348-4995-af8f-eb6d61183dec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431117 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznss\" (UniqueName: \"kubernetes.io/projected/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-kube-api-access-cznss\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431195 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t4rf8"] Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431304 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 
02:56:16.431355 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431379 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvvd\" (UniqueName: \"kubernetes.io/projected/7f436461-6865-411d-9c2d-8c5794d1b4ab-kube-api-access-gdvvd\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431402 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0cf8dd-fc29-442b-9ff2-7360946df755-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431421 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431441 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431461 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-client\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431534 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431563 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431573 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431592 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431612 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431635 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431660 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6flq\" (UniqueName: \"kubernetes.io/projected/e043e178-e7e5-4ddf-b561-7253433d6e81-kube-api-access-t6flq\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431682 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431708 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431757 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-serving-cert\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431797 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-trusted-ca\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431814 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431850 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431883 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-stats-auth\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431911 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431938 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n9h6\" (UniqueName: \"kubernetes.io/projected/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-kube-api-access-5n9h6\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431965 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kjf\" (UniqueName: \"kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.431995 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432020 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432059 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432085 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-auth-proxy-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432110 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-profile-collector-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432147 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-dir\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432170 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-config\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432188 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12981135-5b52-464d-8690-e571eb306507-service-ca-bundle\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432205 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432221 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-cabundle\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432242 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432259 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432280 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcc45\" (UniqueName: \"kubernetes.io/projected/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-kube-api-access-tcc45\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432298 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-serving-cert\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432317 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-encryption-config\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432322 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0cf8dd-fc29-442b-9ff2-7360946df755-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432336 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432379 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432409 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432421 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0cf8dd-fc29-442b-9ff2-7360946df755-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432445 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcf88\" (UniqueName: \"kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432469 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432491 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9f7\" (UniqueName: \"kubernetes.io/projected/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-kube-api-access-pb9f7\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432516 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-service-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432535 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-client\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432580 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432607 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-config\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432629 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432654 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8vq\" (UniqueName: \"kubernetes.io/projected/ce050654-a42c-4472-9990-581502ae1830-kube-api-access-rj8vq\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432674 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrrj\" (UniqueName: \"kubernetes.io/projected/ef466a9b-34cc-4282-ad35-96731b58b8c3-kube-api-access-bxrrj\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432696 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5xv\" (UniqueName: \"kubernetes.io/projected/534f7f1e-7321-49f1-8d68-7b356c28b8ac-kube-api-access-rr5xv\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432716 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-policies\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432736 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-metrics-certs\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432682 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432792 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85eed36b-26fa-4c39-a899-23262c4c1043-proxy-tls\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432899 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-images\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.432948 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.433416 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.433618 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.434044 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.434503 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534f7f1e-7321-49f1-8d68-7b356c28b8ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.434620 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.434998 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.435862 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-serving-cert\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.435949 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436054 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-dir\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436260 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ade98cb-8582-4066-b635-e837a190302d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436285 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436532 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqsch\" (UniqueName: \"kubernetes.io/projected/12981135-5b52-464d-8690-e571eb306507-kube-api-access-qqsch\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436640 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436729 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-images\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436812 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc27v\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-kube-api-access-lc27v\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436896 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436538 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-audit-policies\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.436843 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-config\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437214 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgswf\" (UniqueName: \"kubernetes.io/projected/7f15cd27-e587-4db6-8fcd-b5b2cd559656-kube-api-access-fgswf\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437275 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/534f7f1e-7321-49f1-8d68-7b356c28b8ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437308 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f15cd27-e587-4db6-8fcd-b5b2cd559656-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437337 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437365 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-config\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437470 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-config\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437496 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437534 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef466a9b-34cc-4282-ad35-96731b58b8c3-machine-approver-tls\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437496 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437685 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsj25\" (UniqueName: \"kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437712 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437743 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437796 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5ml\" (UniqueName: \"kubernetes.io/projected/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-kube-api-access-bs5ml\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437845 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-default-certificate\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437871 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7nm8\" (UniqueName: \"kubernetes.io/projected/af11e3ed-3c58-4ad5-9da7-38b9950ff726-kube-api-access-f7nm8\") pod \"downloads-7954f5f757-kkdb8\" (UID: \"af11e3ed-3c58-4ad5-9da7-38b9950ff726\") " pod="openshift-console/downloads-7954f5f757-kkdb8"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437907 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437922 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437928 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-key\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437977 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-config\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.437982 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-serving-cert\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438023 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438049 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-864dz\" (UniqueName: \"kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpd97\" (UniqueName: \"kubernetes.io/projected/8a0cf8dd-fc29-442b-9ff2-7360946df755-kube-api-access-fpd97\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438120 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438139 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ztvq\" (UniqueName: \"kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438158 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-serving-cert\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438178 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6jb\" (UniqueName: \"kubernetes.io/projected/85eed36b-26fa-4c39-a899-23262c4c1043-kube-api-access-nh6jb\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438197 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438219 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ade98cb-8582-4066-b635-e837a190302d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438273 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438300 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438322 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2bs\" (UniqueName: \"kubernetes.io/projected/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-kube-api-access-xw2bs\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438343 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438465 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.438788 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef466a9b-34cc-4282-ad35-96731b58b8c3-auth-proxy-config\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.439090 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.439187 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.439228 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e043e178-e7e5-4ddf-b561-7253433d6e81-config\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.439450 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.440196 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.440807 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.440823 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a0cf8dd-fc29-442b-9ff2-7360946df755-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441092 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441115 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441293 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4782fdd-4348-4995-af8f-eb6d61183dec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441295 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e043e178-e7e5-4ddf-b561-7253433d6e81-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441784 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/534f7f1e-7321-49f1-8d68-7b356c28b8ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.441847 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-etcd-client\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.442142 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.442925 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.442996 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443015 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ade98cb-8582-4066-b635-e837a190302d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443132 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-serving-cert\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443199 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-serving-cert\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443341 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443399 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-srv-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443745 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-encryption-config\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.443917 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.444111 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.444605 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef466a9b-34cc-4282-ad35-96731b58b8c3-machine-approver-tls\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.445299 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.446369 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.446730 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.447510 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce050654-a42c-4472-9990-581502ae1830-profile-collector-cert\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.448190 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.449172 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ggzk"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.451943 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.452245 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.456693 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.456760 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.458287 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.460740 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.460966 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.462592 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.464452 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.466239 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-89rxq"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.467705 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.469511 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65gk2"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.470635 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.471301 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.472188 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m2hvn"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.472955 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hh2wf"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.473977 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.474387 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fhpzk"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.475688 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-94k47"]
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.500619 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.512210 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.532124 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.538946 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.538978 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539105 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539128 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5ml\" (UniqueName: \"kubernetes.io/projected/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-kube-api-access-bs5ml\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539146 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-default-certificate\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539183 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-key\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539216 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539233 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ztvq\" (UniqueName: \"kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539250 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-serving-cert\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539268 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6jb\" (UniqueName: \"kubernetes.io/projected/85eed36b-26fa-4c39-a899-23262c4c1043-kube-api-access-nh6jb\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539287 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539317 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2bs\" (UniqueName: \"kubernetes.io/projected/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-kube-api-access-xw2bs\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539348 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539393 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539425 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\"
(UniqueName: \"kubernetes.io/secret/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539449 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539476 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539501 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539519 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539535 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-stats-auth\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539572 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kjf\" (UniqueName: \"kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539592 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539617 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " 
pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539632 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-cabundle\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539657 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12981135-5b52-464d-8690-e571eb306507-service-ca-bundle\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539675 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcc45\" (UniqueName: \"kubernetes.io/projected/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-kube-api-access-tcc45\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539692 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539709 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9f7\" (UniqueName: \"kubernetes.io/projected/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-kube-api-access-pb9f7\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539730 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539745 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-service-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539760 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-client\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539775 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-config\") pod 
\"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539808 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-metrics-certs\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539828 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85eed36b-26fa-4c39-a899-23262c4c1043-proxy-tls\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539850 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqsch\" (UniqueName: \"kubernetes.io/projected/12981135-5b52-464d-8690-e571eb306507-kube-api-access-qqsch\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539867 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539884 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-images\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539899 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc27v\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-kube-api-access-lc27v\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539917 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgswf\" (UniqueName: \"kubernetes.io/projected/7f15cd27-e587-4db6-8fcd-b5b2cd559656-kube-api-access-fgswf\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539935 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f15cd27-e587-4db6-8fcd-b5b2cd559656-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539954 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.539961 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.540678 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.540707 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.542366 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.542773 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.544897 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.545326 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.545594 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert\") pod \"console-f9d7485db-89rxq\" (UID: 
\"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.546909 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.552389 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.553426 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.554432 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f436461-6865-411d-9c2d-8c5794d1b4ab-serving-cert\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.573039 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.592079 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.632392 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.652041 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.671516 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.690948 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.712065 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.732137 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.766179 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.773337 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 02:56:16 crc 
kubenswrapper[4952]: I1122 02:56:16.773788 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/85eed36b-26fa-4c39-a899-23262c4c1043-images\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.775791 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.793097 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.811924 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.826478 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85eed36b-26fa-4c39-a899-23262c4c1043-proxy-tls\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.831827 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.844197 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f15cd27-e587-4db6-8fcd-b5b2cd559656-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.864640 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mskvt\" (UniqueName: \"kubernetes.io/projected/b72acca1-8338-4e9c-9fa5-55616766e8a9-kube-api-access-mskvt\") pod \"apiserver-76f77b778f-dfwdz\" (UID: \"b72acca1-8338-4e9c-9fa5-55616766e8a9\") " pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.872988 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.892283 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.901838 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-cabundle\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.912893 4952 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.932103 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.951960 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.972699 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 02:56:16 crc kubenswrapper[4952]: I1122 02:56:16.992171 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.006891 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-signing-key\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.013097 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.032717 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.044535 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-client\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.051941 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.061678 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.072926 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.093318 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.105942 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-serving-cert\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.112450 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.112996 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.122509 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-config\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.133325 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.153289 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.172683 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.182143 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-etcd-service-ca\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.192929 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.201575 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.212203 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.234192 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.252093 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.272780 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.293117 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.312321 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.320533 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-metrics-certs\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " 
pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.332044 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.351673 4952 request.go:700] Waited for 1.013638507s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.354200 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.367086 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-default-certificate\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.371451 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.391643 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dfwdz"] Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.392154 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.392796 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/12981135-5b52-464d-8690-e571eb306507-stats-auth\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.402473 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12981135-5b52-464d-8690-e571eb306507-service-ca-bundle\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.413206 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.432649 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.452933 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.473741 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.485480 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: 
\"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.492036 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.501129 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.512145 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.532643 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: E1122 02:56:17.541123 4952 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Nov 22 02:56:17 crc kubenswrapper[4952]: E1122 02:56:17.541234 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls podName:c8b36a8f-760f-47c0-a090-c1f8c8ac44c5 nodeName:}" failed. No retries permitted until 2025-11-22 02:56:18.041206107 +0000 UTC m=+142.347223410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-k75vw" (UID: "c8b36a8f-760f-47c0-a090-c1f8c8ac44c5") : failed to sync secret cache: timed out waiting for the condition Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.552285 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.572743 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.592660 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.612457 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.632575 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.651971 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.673687 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.691861 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.713166 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.732667 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.752006 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.784253 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.793987 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.812039 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.832489 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.852635 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.872700 4952 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.892455 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.912981 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.932743 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.952901 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 02:56:17 crc kubenswrapper[4952]: I1122 02:56:17.971994 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.013073 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.032739 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.052975 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.072034 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.083966 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.091263 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.092041 4952 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.112215 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.132619 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.153241 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.172616 4952 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.192521 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.239694 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bl8d\" (UniqueName: \"kubernetes.io/projected/f4f6bb21-bb06-4b91-a1ff-15b596f1f92f-kube-api-access-6bl8d\") pod \"console-operator-58897d9998-vg8j9\" (UID: \"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f\") " pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.248690 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7qz\" (UniqueName: \"kubernetes.io/projected/a4782fdd-4348-4995-af8f-eb6d61183dec-kube-api-access-4w7qz\") pod \"cluster-samples-operator-665b6dd947-hcvgc\" (UID: \"a4782fdd-4348-4995-af8f-eb6d61183dec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.274792 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznss\" (UniqueName: \"kubernetes.io/projected/3dda8f37-91e7-4ddc-bd94-8caa6f422c7c-kube-api-access-cznss\") pod \"authentication-operator-69f744f599-kkh7f\" (UID: \"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.290221 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.306968 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcf88\" (UniqueName: \"kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88\") pod \"controller-manager-879f6c89f-z6kn5\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.317320 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n9h6\" (UniqueName: \"kubernetes.io/projected/bc8974e3-11f8-4820-ac5b-d70b337ecd4c-kube-api-access-5n9h6\") pod \"openshift-config-operator-7777fb866f-4vlh5\" (UID: \"bc8974e3-11f8-4820-ac5b-d70b337ecd4c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.358224 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrg44\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-kube-api-access-nrg44\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.358277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvvd\" (UniqueName: \"kubernetes.io/projected/7f436461-6865-411d-9c2d-8c5794d1b4ab-kube-api-access-gdvvd\") pod \"apiserver-7bbb656c7d-4xhjg\" (UID: \"7f436461-6865-411d-9c2d-8c5794d1b4ab\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.364780 
4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.370224 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8vq\" (UniqueName: \"kubernetes.io/projected/ce050654-a42c-4472-9990-581502ae1830-kube-api-access-rj8vq\") pod \"catalog-operator-68c6474976-zmfcx\" (UID: \"ce050654-a42c-4472-9990-581502ae1830\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.377324 4952 request.go:700] Waited for 1.940846457s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.395691 4952 generic.go:334] "Generic (PLEG): container finished" podID="b72acca1-8338-4e9c-9fa5-55616766e8a9" containerID="459772c543622b64483423fef331cf56a9c590f00c49477998391bfecccc7ca9" exitCode=0 Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.395814 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" event={"ID":"b72acca1-8338-4e9c-9fa5-55616766e8a9","Type":"ContainerDied","Data":"459772c543622b64483423fef331cf56a9c590f00c49477998391bfecccc7ca9"} Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.395849 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" event={"ID":"b72acca1-8338-4e9c-9fa5-55616766e8a9","Type":"ContainerStarted","Data":"d06c667babe13b4cb65fe0b625695d3a2e102e9acd1edb7168a6ea0ba68dfc74"} Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.396973 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ade98cb-8582-4066-b635-e837a190302d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6nsx4\" (UID: \"7ade98cb-8582-4066-b635-e837a190302d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.397354 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.414695 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrrj\" (UniqueName: \"kubernetes.io/projected/ef466a9b-34cc-4282-ad35-96731b58b8c3-kube-api-access-bxrrj\") pod \"machine-approver-56656f9798-8qgqp\" (UID: \"ef466a9b-34cc-4282-ad35-96731b58b8c3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.433397 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsj25\" (UniqueName: \"kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25\") pod \"oauth-openshift-558db77b4-vpkgq\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.435151 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.450784 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7nm8\" (UniqueName: \"kubernetes.io/projected/af11e3ed-3c58-4ad5-9da7-38b9950ff726-kube-api-access-f7nm8\") pod \"downloads-7954f5f757-kkdb8\" (UID: \"af11e3ed-3c58-4ad5-9da7-38b9950ff726\") " pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.471343 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5xv\" (UniqueName: \"kubernetes.io/projected/534f7f1e-7321-49f1-8d68-7b356c28b8ac-kube-api-access-rr5xv\") pod \"openshift-apiserver-operator-796bbdcf4f-wwch7\" (UID: \"534f7f1e-7321-49f1-8d68-7b356c28b8ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.491212 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6flq\" (UniqueName: \"kubernetes.io/projected/e043e178-e7e5-4ddf-b561-7253433d6e81-kube-api-access-t6flq\") pod \"machine-api-operator-5694c8668f-mz7ld\" (UID: \"e043e178-e7e5-4ddf-b561-7253433d6e81\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.503044 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.506169 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.509694 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-864dz\" (UniqueName: \"kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz\") pod \"route-controller-manager-6576b87f9c-fgnfx\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.515051 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.521533 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.531176 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpd97\" (UniqueName: \"kubernetes.io/projected/8a0cf8dd-fc29-442b-9ff2-7360946df755-kube-api-access-fpd97\") pod \"openshift-controller-manager-operator-756b6f6bc6-sjc2j\" (UID: \"8a0cf8dd-fc29-442b-9ff2-7360946df755\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.533721 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.535049 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.536894 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.549126 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vg8j9"] Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.553488 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.574859 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.584187 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:18 crc kubenswrapper[4952]: W1122 02:56:18.590854 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f6bb21_bb06_4b91_a1ff_15b596f1f92f.slice/crio-c3930aebe78687314949ee16e3fc902a8ef9b1f5eebf6c1bc5f954b2a3828016 WatchSource:0}: Error finding container c3930aebe78687314949ee16e3fc902a8ef9b1f5eebf6c1bc5f954b2a3828016: Status 404 returned error can't find the container with id c3930aebe78687314949ee16e3fc902a8ef9b1f5eebf6c1bc5f954b2a3828016 Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.599605 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.601033 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"] Nov 22 02:56:18 crc kubenswrapper[4952]: W1122 02:56:18.628252 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bdcada_d1ad_45eb_b290_42b2b8dd8257.slice/crio-5cff258cd5e1536d199a9b38da4c7e36d6cb69dbfd7ad6447d6e7b2d3524c968 WatchSource:0}: Error finding container 5cff258cd5e1536d199a9b38da4c7e36d6cb69dbfd7ad6447d6e7b2d3524c968: Status 404 returned error can't find the container with id 5cff258cd5e1536d199a9b38da4c7e36d6cb69dbfd7ad6447d6e7b2d3524c968 Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.631221 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.639434 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5ml\" (UniqueName: \"kubernetes.io/projected/9aae38d4-efc5-4f2b-acdf-0d3a607b54a9-kube-api-access-bs5ml\") pod \"multus-admission-controller-857f4d67dd-5gwx5\" (UID: \"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.646690 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.653652 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ztvq\" (UniqueName: \"kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq\") pod \"console-f9d7485db-89rxq\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") " pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.685774 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6jb\" (UniqueName: \"kubernetes.io/projected/85eed36b-26fa-4c39-a899-23262c4c1043-kube-api-access-nh6jb\") pod \"machine-config-operator-74547568cd-nbpml\" (UID: \"85eed36b-26fa-4c39-a899-23262c4c1043\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.703165 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2bs\" (UniqueName: \"kubernetes.io/projected/c8b36a8f-760f-47c0-a090-c1f8c8ac44c5-kube-api-access-xw2bs\") pod \"control-plane-machine-set-operator-78cbb6b69f-k75vw\" (UID: \"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.721840 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.723461 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9f7\" (UniqueName: \"kubernetes.io/projected/7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed-kube-api-access-pb9f7\") pod \"service-ca-9c57cc56f-2ggzk\" (UID: \"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.724774 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg"] Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.735366 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kjf\" (UniqueName: \"kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf\") pod \"collect-profiles-29396325-swzgq\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.750268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqsch\" (UniqueName: \"kubernetes.io/projected/12981135-5b52-464d-8690-e571eb306507-kube-api-access-qqsch\") pod \"router-default-5444994796-9wz27\" (UID: \"12981135-5b52-464d-8690-e571eb306507\") " pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.755631 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"] Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.766736 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc27v\" (UniqueName: \"kubernetes.io/projected/b46a19f6-7d04-44b1-a2ad-6146c66fb5e2-kube-api-access-lc27v\") pod \"ingress-operator-5b745b69d9-7vg2x\" (UID: \"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.771091 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.788659 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgswf\" (UniqueName: \"kubernetes.io/projected/7f15cd27-e587-4db6-8fcd-b5b2cd559656-kube-api-access-fgswf\") pod \"package-server-manager-789f6589d5-sw6q8\" (UID: \"7f15cd27-e587-4db6-8fcd-b5b2cd559656\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.795856 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.813811 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aee2ceff-d701-49e2-9dca-8edb0bd1d59e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m5hc7\" (UID: \"aee2ceff-d701-49e2-9dca-8edb0bd1d59e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.847896 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcc45\" (UniqueName: \"kubernetes.io/projected/2f5ae1d2-b361-49e2-9460-b447f70a4cd3-kube-api-access-tcc45\") pod \"etcd-operator-b45778765-65gk2\" (UID: \"2f5ae1d2-b361-49e2-9460-b447f70a4cd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.901895 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.901990 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0d0a12-4c75-4177-9e76-26baecebbf14-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902040 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-srv-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902123 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhwp\" (UniqueName: \"kubernetes.io/projected/a420a904-6165-4f7a-a29e-3c5549e5cec5-kube-api-access-4fhwp\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: 
\"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902166 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx6gk\" (UniqueName: \"kubernetes.io/projected/ec0d0a12-4c75-4177-9e76-26baecebbf14-kube-api-access-cx6gk\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902239 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0893cff8-0528-4b35-b1f9-faa91e42e5a5-metrics-tls\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902314 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902381 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a420a904-6165-4f7a-a29e-3c5549e5cec5-proxy-tls\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-webhook-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902488 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902675 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0449fcf2-652a-4be5-957e-ecf47ef86668-config\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902747 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-tmpfs\") pod 
\"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902816 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.902903 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f481093-b909-4014-a103-2a655003e144-config\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903030 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8f5\" (UniqueName: \"kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903118 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75zc\" (UniqueName: \"kubernetes.io/projected/0893cff8-0528-4b35-b1f9-faa91e42e5a5-kube-api-access-z75zc\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903172 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f481093-b909-4014-a103-2a655003e144-serving-cert\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903201 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zv2\" (UniqueName: \"kubernetes.io/projected/6630a2b1-f78c-4c1e-8511-6bb8dc615362-kube-api-access-z5zv2\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903225 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903247 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a420a904-6165-4f7a-a29e-3c5549e5cec5-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903283 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8zdf\" (UniqueName: \"kubernetes.io/projected/a57045cb-1c86-432e-ae6f-2f973ce52596-kube-api-access-j8zdf\") pod \"migrator-59844c95c7-8j98h\" (UID: \"a57045cb-1c86-432e-ae6f-2f973ce52596\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903355 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglbf\" (UniqueName: \"kubernetes.io/projected/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-kube-api-access-rglbf\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903419 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0449fcf2-652a-4be5-957e-ecf47ef86668-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903443 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903505 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-config\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903528 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0449fcf2-652a-4be5-957e-ecf47ef86668-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903619 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903690 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903726 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-apiservice-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903747 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0d0a12-4c75-4177-9e76-26baecebbf14-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903769 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903834 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903860 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt8hx\" (UniqueName: \"kubernetes.io/projected/6f481093-b909-4014-a103-2a655003e144-kube-api-access-tt8hx\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903911 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.903973 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594zj\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:18 crc kubenswrapper[4952]: E1122 02:56:18.906116 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.406094247 +0000 UTC m=+143.712111600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.915624 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.917449 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.939706 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.953531 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.954440 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.975952 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" Nov 22 02:56:18 crc kubenswrapper[4952]: I1122 02:56:18.997721 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005459 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.005671 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.505633211 +0000 UTC m=+143.811650484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005760 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0449fcf2-652a-4be5-957e-ecf47ef86668-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005824 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005853 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-socket-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005870 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbbwz\" (UniqueName: \"kubernetes.io/projected/028b56f2-cfad-4db5-81ab-fa866f42f9c3-kube-api-access-nbbwz\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005906 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-config\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005922 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005954 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6ht\" (UniqueName: \"kubernetes.io/projected/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-kube-api-access-vf6ht\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005973 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-csi-data-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.005998 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0449fcf2-652a-4be5-957e-ecf47ef86668-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006061 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006090 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006108 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-config-volume\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006146 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-apiservice-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006166 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0d0a12-4c75-4177-9e76-26baecebbf14-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006190 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006226 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-mountpoint-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006278 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006297 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt8hx\" (UniqueName: \"kubernetes.io/projected/6f481093-b909-4014-a103-2a655003e144-kube-api-access-tt8hx\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006338 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-registration-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006366 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006392 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-594zj\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006493 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006583 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0d0a12-4c75-4177-9e76-26baecebbf14-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006628 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-srv-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: 
\"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006658 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhwp\" (UniqueName: \"kubernetes.io/projected/a420a904-6165-4f7a-a29e-3c5549e5cec5-kube-api-access-4fhwp\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006685 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx6gk\" (UniqueName: \"kubernetes.io/projected/ec0d0a12-4c75-4177-9e76-26baecebbf14-kube-api-access-cx6gk\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006703 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-node-bootstrap-token\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0893cff8-0528-4b35-b1f9-faa91e42e5a5-metrics-tls\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006768 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-cert\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006816 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006844 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a420a904-6165-4f7a-a29e-3c5549e5cec5-proxy-tls\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006862 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-webhook-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" 
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006878 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqb52\" (UniqueName: \"kubernetes.io/projected/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-kube-api-access-lqb52\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.006911 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.008037 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.508024252 +0000 UTC m=+143.814041525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.008650 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.009452 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.009751 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.014757 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-config\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.015000 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.015312 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.015927 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.016006 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-plugins-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.016249 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0449fcf2-652a-4be5-957e-ecf47ef86668-config\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.016677 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec0d0a12-4c75-4177-9e76-26baecebbf14-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.017461 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-metrics-tls\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.017504 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-tmpfs\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.017564 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.017639 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f481093-b909-4014-a103-2a655003e144-config\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018245 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8f5\" (UniqueName: \"kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018434 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75zc\" (UniqueName: \"kubernetes.io/projected/0893cff8-0528-4b35-b1f9-faa91e42e5a5-kube-api-access-z75zc\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018457 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f481093-b909-4014-a103-2a655003e144-serving-cert\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018470 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f481093-b909-4014-a103-2a655003e144-config\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018475 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zv2\" (UniqueName: \"kubernetes.io/projected/6630a2b1-f78c-4c1e-8511-6bb8dc615362-kube-api-access-z5zv2\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018518 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.018586 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0449fcf2-652a-4be5-957e-ecf47ef86668-config\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.019287 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a420a904-6165-4f7a-a29e-3c5549e5cec5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.019344 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8tw\" (UniqueName: \"kubernetes.io/projected/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-kube-api-access-7v8tw\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.019420 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-tmpfs\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.019642 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8zdf\" (UniqueName: \"kubernetes.io/projected/a57045cb-1c86-432e-ae6f-2f973ce52596-kube-api-access-j8zdf\") pod \"migrator-59844c95c7-8j98h\" (UID: \"a57045cb-1c86-432e-ae6f-2f973ce52596\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.019979 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a420a904-6165-4f7a-a29e-3c5549e5cec5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.020027 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.020239 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-certs\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.020468 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglbf\" (UniqueName: \"kubernetes.io/projected/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-kube-api-access-rglbf\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.020926 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a420a904-6165-4f7a-a29e-3c5549e5cec5-proxy-tls\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.021860 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0449fcf2-652a-4be5-957e-ecf47ef86668-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.022000 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6630a2b1-f78c-4c1e-8511-6bb8dc615362-srv-cert\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.024193 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-webhook-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.024961 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0893cff8-0528-4b35-b1f9-faa91e42e5a5-metrics-tls\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.025203 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.026975 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.027323 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-apiservice-cert\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.029558 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec0d0a12-4c75-4177-9e76-26baecebbf14-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.029848 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f481093-b909-4014-a103-2a655003e144-serving-cert\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.029943 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.030056 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt8hx\" (UniqueName: \"kubernetes.io/projected/6f481093-b909-4014-a103-2a655003e144-kube-api-access-tt8hx\") pod \"service-ca-operator-777779d784-9jtzv\" (UID: \"6f481093-b909-4014-a103-2a655003e144\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.037764 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9wz27"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.045587 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.060453 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-594zj\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.081735 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kdk4\" (UID: \"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.100795 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.103776 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kkh7f"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.115369 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.116708 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.122891 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.122994 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.622977742 +0000 UTC m=+143.928995005 (durationBeforeRetry 500ms).
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123032 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0449fcf2-652a-4be5-957e-ecf47ef86668-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jgbvn\" (UID: \"0449fcf2-652a-4be5-957e-ecf47ef86668\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123249 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123284 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-registration-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123339 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-node-bootstrap-token\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123358 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-cert\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123377 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqb52\" (UniqueName: \"kubernetes.io/projected/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-kube-api-access-lqb52\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123400 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-plugins-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123418 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-metrics-tls\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123474 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8tw\" (UniqueName: \"kubernetes.io/projected/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-kube-api-access-7v8tw\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123499 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-certs\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123531 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-socket-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123560 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbbwz\" (UniqueName: \"kubernetes.io/projected/028b56f2-cfad-4db5-81ab-fa866f42f9c3-kube-api-access-nbbwz\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123578 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6ht\" (UniqueName: \"kubernetes.io/projected/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-kube-api-access-vf6ht\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123592 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-csi-data-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123615 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-config-volume\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123636 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-mountpoint-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.123742 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-mountpoint-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.123813 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.623787822 +0000 UTC m=+143.929805095 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.126495 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-registration-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.126585 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-plugins-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.127619 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-node-bootstrap-token\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.127864 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-socket-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.128160 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/028b56f2-cfad-4db5-81ab-fa866f42f9c3-csi-data-dir\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.128971 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-config-volume\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.132534 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-cert\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.132992 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-metrics-tls\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.133645 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-certs\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.144005 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx6gk\" (UniqueName: \"kubernetes.io/projected/ec0d0a12-4c75-4177-9e76-26baecebbf14-kube-api-access-cx6gk\") pod \"kube-storage-version-migrator-operator-b67b599dd-rlp29\" (UID: \"ec0d0a12-4c75-4177-9e76-26baecebbf14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.144815 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhwp\" (UniqueName: \"kubernetes.io/projected/a420a904-6165-4f7a-a29e-3c5549e5cec5-kube-api-access-4fhwp\") pod \"machine-config-controller-84d6567774-8ngds\" (UID: \"a420a904-6165-4f7a-a29e-3c5549e5cec5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.186954 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zv2\" (UniqueName: \"kubernetes.io/projected/6630a2b1-f78c-4c1e-8511-6bb8dc615362-kube-api-access-z5zv2\") pod \"olm-operator-6b444d44fb-2c9gx\" (UID: \"6630a2b1-f78c-4c1e-8511-6bb8dc615362\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.212604 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75zc\" (UniqueName: \"kubernetes.io/projected/0893cff8-0528-4b35-b1f9-faa91e42e5a5-kube-api-access-z75zc\") pod \"dns-operator-744455d44c-t4rf8\" (UID: \"0893cff8-0528-4b35-b1f9-faa91e42e5a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.229347 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.230005 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.230612 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8f5\" (UniqueName: \"kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5\") pod \"marketplace-operator-79b997595-9g9km\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.231080 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.731059075 +0000 UTC m=+144.037076348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.231563 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.232841 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.731918056 +0000 UTC m=+144.037935329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.236197 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8zdf\" (UniqueName: \"kubernetes.io/projected/a57045cb-1c86-432e-ae6f-2f973ce52596-kube-api-access-j8zdf\") pod \"migrator-59844c95c7-8j98h\" (UID: \"a57045cb-1c86-432e-ae6f-2f973ce52596\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.264688 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.277358 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.277423 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.278918 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglbf\" (UniqueName: \"kubernetes.io/projected/4c07437b-bd0d-4569-b4a4-ff08b56f4a23-kube-api-access-rglbf\") pod \"packageserver-d55dfcdfc-fch9k\" (UID: \"4c07437b-bd0d-4569-b4a4-ff08b56f4a23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.281516 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.288253 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.322469 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqb52\" (UniqueName: \"kubernetes.io/projected/40f15d13-2eba-4271-bfd6-6c4b32f77ea2-kube-api-access-lqb52\") pod \"machine-config-server-hh2wf\" (UID: \"40f15d13-2eba-4271-bfd6-6c4b32f77ea2\") " pod="openshift-machine-config-operator/machine-config-server-hh2wf"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.332882 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.333489 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.833458602 +0000 UTC m=+144.139475875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.357643 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.362872 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.364501 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.367087 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6ht\" (UniqueName: \"kubernetes.io/projected/c1faf425-0d47-41a5-8727-7f8f5d74f8b9-kube-api-access-vf6ht\") pod \"ingress-canary-m2hvn\" (UID: \"c1faf425-0d47-41a5-8727-7f8f5d74f8b9\") " pod="openshift-ingress-canary/ingress-canary-m2hvn"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.377372 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.383903 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.386349 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.387476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8tw\" (UniqueName: \"kubernetes.io/projected/867a3e20-6f2f-4dfe-a378-e6357d6c19e3-kube-api-access-7v8tw\") pod \"dns-default-fhpzk\" (UID: \"867a3e20-6f2f-4dfe-a378-e6357d6c19e3\") " pod="openshift-dns/dns-default-fhpzk"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.389081 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbbwz\" (UniqueName: \"kubernetes.io/projected/028b56f2-cfad-4db5-81ab-fa866f42f9c3-kube-api-access-nbbwz\") pod \"csi-hostpathplugin-94k47\" (UID: \"028b56f2-cfad-4db5-81ab-fa866f42f9c3\") " pod="hostpath-provisioner/csi-hostpathplugin-94k47"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.394901 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kkdb8"]
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.414694 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.419903 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" event={"ID":"b72acca1-8338-4e9c-9fa5-55616766e8a9","Type":"ContainerStarted","Data":"4cf0761a070abc5d382fd4aeaa0855b7d43e8557c808c880300a534b96eedd34"}
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.422920 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.436790 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9wz27" event={"ID":"12981135-5b52-464d-8690-e571eb306507","Type":"ContainerStarted","Data":"5d69ea56e522525ea6af23848bf52e86f544b17910e534049ac6c1050f139389"}
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.437768 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.438067 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:19.938055446 +0000 UTC m=+144.244072709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.438391 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" event={"ID":"bc8974e3-11f8-4820-ac5b-d70b337ecd4c","Type":"ContainerStarted","Data":"21c0c8f3e898dc54d49ff3703a7b70c9f6596c35add5d24b58dadecfaf80bced"}
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.445154 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" event={"ID":"7f436461-6865-411d-9c2d-8c5794d1b4ab","Type":"ContainerStarted","Data":"f55d16377dd4184ec15a3dd324ca0b779fe23b8c726f77c9ea0d082f6745ee75"}
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.446002 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" event={"ID":"ef466a9b-34cc-4282-ad35-96731b58b8c3","Type":"ContainerStarted","Data":"0351886acc7b612c20c90f2d3cc4768bba462664e364d1e73c3d88a80afb2540"}
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.446809 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" event={"ID":"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e","Type":"ContainerStarted","Data":"8d1ebf7c3495dec7892d17673cb7f5a7ecbe4910a1202567449676018cd9f73a"}
event={"ID":"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e","Type":"ContainerStarted","Data":"8d1ebf7c3495dec7892d17673cb7f5a7ecbe4910a1202567449676018cd9f73a"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.450932 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" event={"ID":"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f","Type":"ContainerStarted","Data":"beefbee54f1610c55faa4b21cf3e9132d8b4303b306016454cc1cc905175b1fd"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.450951 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" event={"ID":"f4f6bb21-bb06-4b91-a1ff-15b596f1f92f","Type":"ContainerStarted","Data":"c3930aebe78687314949ee16e3fc902a8ef9b1f5eebf6c1bc5f954b2a3828016"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.452945 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.454806 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" event={"ID":"534f7f1e-7321-49f1-8d68-7b356c28b8ac","Type":"ContainerStarted","Data":"20e27633678502a91a5a34eac7e7d8c1a253975420542c4efdedc377d213a1b8"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.455342 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" event={"ID":"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c","Type":"ContainerStarted","Data":"5072844a13dc018f84362a305262b6d31b28e21e074ae45f8bd21895fbd714c3"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.455919 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" event={"ID":"ce050654-a42c-4472-9990-581502ae1830","Type":"ContainerStarted","Data":"df1ac55177c954a379fa98a2f362bdb5f00b7add31917d15ae1543ca5e7c2353"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.469132 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fhpzk" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.477606 4952 patch_prober.go:28] interesting pod/console-operator-58897d9998-vg8j9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.477691 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" podUID="f4f6bb21-bb06-4b91-a1ff-15b596f1f92f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.478755 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" event={"ID":"63bdcada-d1ad-45eb-b290-42b2b8dd8257","Type":"ContainerStarted","Data":"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.478824 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" event={"ID":"63bdcada-d1ad-45eb-b290-42b2b8dd8257","Type":"ContainerStarted","Data":"5cff258cd5e1536d199a9b38da4c7e36d6cb69dbfd7ad6447d6e7b2d3524c968"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.480282 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.480878 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" event={"ID":"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276","Type":"ContainerStarted","Data":"f7cf6d585b72e1d6431f744e6c731c1821898500693a8779619ec3a8460b7e21"} Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.484259 4952 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z6kn5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.484329 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.489192 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-94k47" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.495170 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m2hvn" Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.505955 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hh2wf" Nov 22 02:56:19 crc kubenswrapper[4952]: W1122 02:56:19.525136 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ade98cb_8582_4066_b635_e837a190302d.slice/crio-80dd77738364632c351184cbfa7ad872b4284bd272f761fc372698c6dfe1376b WatchSource:0}: Error finding container 80dd77738364632c351184cbfa7ad872b4284bd272f761fc372698c6dfe1376b: Status 404 returned error can't find the container with id 80dd77738364632c351184cbfa7ad872b4284bd272f761fc372698c6dfe1376b Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.538812 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.539421 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw"] Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.540259 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.040237538 +0000 UTC m=+144.346254871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.584865 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mz7ld"] Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.640691 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.641153 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.141139577 +0000 UTC m=+144.447156850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.642378 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j"] Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.649826 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-89rxq"] Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.661463 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x"] Nov 22 02:56:19 crc kubenswrapper[4952]: W1122 02:56:19.670339 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b36a8f_760f_47c0_a090_c1f8c8ac44c5.slice/crio-5e3041757a3f4a27a5a55deddf834fc8f1e674f7d4545cd6ae8d55b7e0e45650 WatchSource:0}: Error finding container 5e3041757a3f4a27a5a55deddf834fc8f1e674f7d4545cd6ae8d55b7e0e45650: Status 404 returned error can't find the container with id 5e3041757a3f4a27a5a55deddf834fc8f1e674f7d4545cd6ae8d55b7e0e45650 Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.743675 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.744685 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.244647162 +0000 UTC m=+144.550664435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.849362 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.849973 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
Nov 22 02:56:19 crc kubenswrapper[4952]: I1122 02:56:19.950309 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:19 crc kubenswrapper[4952]: E1122 02:56:19.950724 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.450701959 +0000 UTC m=+144.756719242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.050583 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65gk2"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.053232 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.055935 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.555913678 +0000 UTC m=+144.861930951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.059984 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.068703 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.085846 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ggzk"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.154075 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.154465 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.654448086 +0000 UTC m=+144.960465359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: W1122 02:56:20.191485 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7c8cee_9a36_4dad_9a04_3c30fe4a7bed.slice/crio-0cf132a1f7784a13c0c8589bf1e9912fadb747d61045968258bf6874a7f813b2 WatchSource:0}: Error finding container 0cf132a1f7784a13c0c8589bf1e9912fadb747d61045968258bf6874a7f813b2: Status 404 returned error can't find the container with id 0cf132a1f7784a13c0c8589bf1e9912fadb747d61045968258bf6874a7f813b2
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.256986 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.756968266 +0000 UTC m=+145.062985539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.258136 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.379108 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.381514 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5gwx5"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.382947 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.393822 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.395864 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:20.895848831 +0000 UTC m=+145.201866104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.421275 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.444142 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.469097 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.500324 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.501209 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.001181583 +0000 UTC m=+145.307198856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.517589 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" event={"ID":"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e","Type":"ContainerStarted","Data":"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.519864 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.522650 4952 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vpkgq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body=
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.522753 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.523856 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" event={"ID":"8a0cf8dd-fc29-442b-9ff2-7360946df755","Type":"ContainerStarted","Data":"00cc82cc1122f236bdebc39f9a7d43cbbcbd10c34389a33c16f7215d32da79ea"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.589916 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89rxq" event={"ID":"aae47c6e-1d61-40ec-851a-c3e5a6242dcc","Type":"ContainerStarted","Data":"a7c29d4b60a5a64f7b3449f0a707c1ae4c4a7216abb589f3d40baa8855edcc77"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.590043 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.605039 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" event={"ID":"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276","Type":"ContainerStarted","Data":"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.606435 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.612261 4952 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fgnfx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.612343 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.616135 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.617864 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.117840416 +0000 UTC m=+145.423857689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.629021 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" event={"ID":"534f7f1e-7321-49f1-8d68-7b356c28b8ac","Type":"ContainerStarted","Data":"3c0a0526214741922f2551d431f76e37e44a3316b16a3ffb4f94d7212f5a791a"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.630959 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" event={"ID":"ef466a9b-34cc-4282-ad35-96731b58b8c3","Type":"ContainerStarted","Data":"73a68ab4618fcbb2e7667a2ddad598e1e700ddce575de59320ea7c5e85706b0d"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.632272 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" event={"ID":"ce050654-a42c-4472-9990-581502ae1830","Type":"ContainerStarted","Data":"84fd728a5411c77344585f4b3c7eb6ac7261bf99db009d48d076bb86d7ea78e0"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.634002 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.634255 4952 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-zmfcx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.634290 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" podUID="ce050654-a42c-4472-9990-581502ae1830" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.636879 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" event={"ID":"aee2ceff-d701-49e2-9dca-8edb0bd1d59e","Type":"ContainerStarted","Data":"70506a13dc26a6d5f01898324a361319d2929911f8d4c6da402c9ba8afdcb10a"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.639655 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" event={"ID":"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2","Type":"ContainerStarted","Data":"9bca99d6f455594bff80601a5aae66714d4ee87184cf1de5fd3f7056c4c33c27"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.668373 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hh2wf" event={"ID":"40f15d13-2eba-4271-bfd6-6c4b32f77ea2","Type":"ContainerStarted","Data":"83f93447ad23c430224c83d7a65cca0c588e753c18b1f99ff9aaf738b10bc694"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.677933 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" event={"ID":"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed","Type":"ContainerStarted","Data":"0cf132a1f7784a13c0c8589bf1e9912fadb747d61045968258bf6874a7f813b2"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.693613 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9wz27" event={"ID":"12981135-5b52-464d-8690-e571eb306507","Type":"ContainerStarted","Data":"4b232507cd5c40460fea583037f2c7a6f07d7a6e1cbbdc028829a85ef7db0470"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.711224 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" event={"ID":"bc8974e3-11f8-4820-ac5b-d70b337ecd4c","Type":"ContainerStarted","Data":"474e584d13bf4ddc8be87ad12fe72e4e860ff2f48fbffdee43f43e7d46e5f8ff"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.711286 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.732460 4952 generic.go:334] "Generic (PLEG): container finished" podID="7f436461-6865-411d-9c2d-8c5794d1b4ab" containerID="54a2456796cdaa7557e6fa9e8b6afe9f007938ccd01d361cd8fee637c83beabb" exitCode=0
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.732597 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" event={"ID":"7f436461-6865-411d-9c2d-8c5794d1b4ab","Type":"ContainerDied","Data":"54a2456796cdaa7557e6fa9e8b6afe9f007938ccd01d361cd8fee637c83beabb"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.737421 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.737944 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.237927407 +0000 UTC m=+145.543944680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.744387 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" event={"ID":"a4782fdd-4348-4995-af8f-eb6d61183dec","Type":"ContainerStarted","Data":"1cc001e3f141fa91d565da421531a183add591de18219345fab491d1932d3322"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.750528 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.754668 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.754748 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.778788 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" event={"ID":"85eed36b-26fa-4c39-a899-23262c4c1043","Type":"ContainerStarted","Data":"e478c90763f613ce0fe7bd886ba3eecebaf39a2b018f09333243407dd14409ef"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.790667 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fhpzk"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.792782 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.804206 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t4rf8"]
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.816992 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" event={"ID":"7ade98cb-8582-4066-b635-e837a190302d","Type":"ContainerStarted","Data":"239b13489837f962ecab142527193d4aa79fb4532f0beecd448a8bf103c03aaa"}
Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.817036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" event={"ID":"7ade98cb-8582-4066-b635-e837a190302d","Type":"ContainerStarted","Data":"80dd77738364632c351184cbfa7ad872b4284bd272f761fc372698c6dfe1376b"}
Nov 22 02:56:20 crc kubenswrapper[4952]: W1122 02:56:20.826176 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57045cb_1c86_432e_ae6f_2f973ce52596.slice/crio-b0a365731aaf169b285dc4a5f2194f27c5565485e618eebf88eb193d6d314e3e WatchSource:0}: Error finding
container b0a365731aaf169b285dc4a5f2194f27c5565485e618eebf88eb193d6d314e3e: Status 404 returned error can't find the container with id b0a365731aaf169b285dc4a5f2194f27c5565485e618eebf88eb193d6d314e3e Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.826720 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" event={"ID":"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5","Type":"ContainerStarted","Data":"a99ed277459ebdcc4351444d6e48658daa707edffd8c5a9f2c93e11114a9ac0b"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.826795 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" event={"ID":"c8b36a8f-760f-47c0-a090-c1f8c8ac44c5","Type":"ContainerStarted","Data":"5e3041757a3f4a27a5a55deddf834fc8f1e674f7d4545cd6ae8d55b7e0e45650"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.829048 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" event={"ID":"e043e178-e7e5-4ddf-b561-7253433d6e81","Type":"ContainerStarted","Data":"facfe6cc075af9d72850d5f5368bad8dcb0d0b09b64cdba5eee1cdf03731cba1"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.829117 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" event={"ID":"e043e178-e7e5-4ddf-b561-7253433d6e81","Type":"ContainerStarted","Data":"4596500b09e05969d43f23f24adf87364591d05fcc350cb9fd19bfaf076dc27c"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.839259 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.841398 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.341369791 +0000 UTC m=+145.647387094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.850935 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" event={"ID":"2f5ae1d2-b361-49e2-9460-b447f70a4cd3","Type":"ContainerStarted","Data":"ee9385dedfcf8f9a8cb7d471e6bb9331c85d752bcc42db6443f8dab78a869bbe"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.855873 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" event={"ID":"7f15cd27-e587-4db6-8fcd-b5b2cd559656","Type":"ContainerStarted","Data":"7a6e83c598f3db31777c05db571e25ee95409dd877b8bf1e101a06fd94a9521c"} Nov 22 02:56:20 crc kubenswrapper[4952]: W1122 02:56:20.872633 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c07437b_bd0d_4569_b4a4_ff08b56f4a23.slice/crio-856219422f1ab66258e2f1ba84b28779aecb2f9bac338b2473ac541bde198f55 WatchSource:0}: Error finding container 856219422f1ab66258e2f1ba84b28779aecb2f9bac338b2473ac541bde198f55: Status 404 returned error can't find the container with id 856219422f1ab66258e2f1ba84b28779aecb2f9bac338b2473ac541bde198f55 Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.873304 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" event={"ID":"b72acca1-8338-4e9c-9fa5-55616766e8a9","Type":"ContainerStarted","Data":"dc12d4a82c1fbdce67e3e676a20f33ae143deb890a5584a3f48714bb4145e698"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.875816 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" event={"ID":"3dda8f37-91e7-4ddc-bd94-8caa6f422c7c","Type":"ContainerStarted","Data":"a861ec164f90850681d674c8465bf91459681e067a86a8b18177e391c50a3b3a"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.879569 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kkdb8" event={"ID":"af11e3ed-3c58-4ad5-9da7-38b9950ff726","Type":"ContainerStarted","Data":"359d3d6e8aff27ff51635e48b4aabf8df1a6704a74cf0513e22759788d01ebc5"} Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.880352 4952 patch_prober.go:28] interesting pod/console-operator-58897d9998-vg8j9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.880422 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" podUID="f4f6bb21-bb06-4b91-a1ff-15b596f1f92f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.880521 4952 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z6kn5 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.880601 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.881170 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.890092 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.890143 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.894718 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-94k47"] Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.914746 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m2hvn"] Nov 22 02:56:20 crc kubenswrapper[4952]: W1122 02:56:20.919080 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0893cff8_0528_4b35_b1f9_faa91e42e5a5.slice/crio-3ec9d91094b1829c4cfcb43c411535205b46b54c29268af58b38fda913405581 WatchSource:0}: Error finding container 3ec9d91094b1829c4cfcb43c411535205b46b54c29268af58b38fda913405581: Status 404 returned error can't find the container with id 3ec9d91094b1829c4cfcb43c411535205b46b54c29268af58b38fda913405581 Nov 22 02:56:20 crc kubenswrapper[4952]: I1122 02:56:20.941687 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:20 crc kubenswrapper[4952]: E1122 02:56:20.943215 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.443190194 +0000 UTC m=+145.749207467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.039184 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.043148 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.043199 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.043359 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.045837 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.545812237 +0000 UTC m=+145.851829510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.056174 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" podStartSLOduration=124.056149752 podStartE2EDuration="2m4.056149752s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.052312944 +0000 UTC m=+145.358330227" watchObservedRunningTime="2025-11-22 02:56:21.056149752 +0000 UTC m=+145.362167025" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.085852 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" podStartSLOduration=124.085830653 podStartE2EDuration="2m4.085830653s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.085493385 +0000 UTC m=+145.391510668" watchObservedRunningTime="2025-11-22 02:56:21.085830653 +0000 UTC m=+145.391847936" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.145883 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.146176 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.646163842 +0000 UTC m=+145.952181115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.163270 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k75vw" podStartSLOduration=124.16324704 podStartE2EDuration="2m4.16324704s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.161787533 +0000 UTC m=+145.467804806" watchObservedRunningTime="2025-11-22 02:56:21.16324704 +0000 UTC m=+145.469264313" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.210983 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-kkdb8" podStartSLOduration=124.210957954 podStartE2EDuration="2m4.210957954s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.207104035 +0000 UTC m=+145.513121318" watchObservedRunningTime="2025-11-22 02:56:21.210957954 +0000 UTC m=+145.516975227" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.251038 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.251523 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.751501294 +0000 UTC m=+146.057518557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.258484 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9wz27" podStartSLOduration=124.258456753 podStartE2EDuration="2m4.258456753s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.258189286 +0000 UTC m=+145.564206559" watchObservedRunningTime="2025-11-22 02:56:21.258456753 +0000 UTC m=+145.564474026" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.296186 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kkh7f" podStartSLOduration=124.29613475 podStartE2EDuration="2m4.29613475s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.290206477 +0000 UTC m=+145.596223750" watchObservedRunningTime="2025-11-22 02:56:21.29613475 +0000 UTC m=+145.602152013" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.334876 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" podStartSLOduration=124.334851073 podStartE2EDuration="2m4.334851073s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.333053547 +0000 UTC m=+145.639070820" watchObservedRunningTime="2025-11-22 02:56:21.334851073 +0000 UTC m=+145.640868346" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.352833 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.353240 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.853227344 +0000 UTC m=+146.159244617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.374727 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" podStartSLOduration=124.374707616 podStartE2EDuration="2m4.374707616s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.372846687 +0000 UTC m=+145.678863990" watchObservedRunningTime="2025-11-22 02:56:21.374707616 +0000 UTC m=+145.680724889" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.408797 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wwch7" podStartSLOduration=124.408771159 podStartE2EDuration="2m4.408771159s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.407591249 +0000 UTC m=+145.713608542" watchObservedRunningTime="2025-11-22 02:56:21.408771159 +0000 UTC m=+145.714788432" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.455051 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.455641 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:21.955614592 +0000 UTC m=+146.261631865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.488609 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" podStartSLOduration=124.488521446 podStartE2EDuration="2m4.488521446s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.486792591 +0000 UTC m=+145.792809864" watchObservedRunningTime="2025-11-22 02:56:21.488521446 +0000 UTC m=+145.794538709" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.531351 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" podStartSLOduration=124.531328954 podStartE2EDuration="2m4.531328954s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.530613656 +0000 UTC m=+145.836630939" watchObservedRunningTime="2025-11-22 02:56:21.531328954 +0000 UTC m=+145.837346227" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.557021 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.557446 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.057432013 +0000 UTC m=+146.363449286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.572668 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6nsx4" podStartSLOduration=124.572636944 podStartE2EDuration="2m4.572636944s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:21.571232448 +0000 UTC m=+145.877249721" watchObservedRunningTime="2025-11-22 02:56:21.572636944 +0000 UTC m=+145.878654227" Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.658424 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.658644 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.15860523 +0000 UTC m=+146.464622493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.659116 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.659629 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.159611535 +0000 UTC m=+146.465628808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.760138 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.760943 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.260922695 +0000 UTC m=+146.566939968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.865242 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.867164 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.36714191 +0000 UTC m=+146.673159183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.974892 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.975019 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.474982798 +0000 UTC m=+146.781000071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:21 crc kubenswrapper[4952]: I1122 02:56:21.975378 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:21 crc kubenswrapper[4952]: E1122 02:56:21.975828 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.475819828 +0000 UTC m=+146.781837101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.025441 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hh2wf" event={"ID":"40f15d13-2eba-4271-bfd6-6c4b32f77ea2","Type":"ContainerStarted","Data":"eb5a0e535a5dd4835a79d311588575e3f93e012f0f845d499e3c5207f15661ac"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.039715 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" event={"ID":"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72","Type":"ContainerStarted","Data":"b54fd9261f02e435caba16e7bc74ed45d63cbcb66ee1f45661fd74cd4623199a"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.039772 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" event={"ID":"d26d8bc9-7ecf-4bb4-9a6b-b815722b0e72","Type":"ContainerStarted","Data":"489aa313c964f6dc0bce2553283e93565d7d5ae4c2c10c8d69b1216aa62b1f72"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.046183 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:22 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:22 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:22 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.046579 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.062168 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" event={"ID":"a420a904-6165-4f7a-a29e-3c5549e5cec5","Type":"ContainerStarted","Data":"76bad258b67dbb4ac7fb2b4298545f563da015cb1869d7d3efd9e6b28307113f"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.071205 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hh2wf" podStartSLOduration=6.071182465 podStartE2EDuration="6.071182465s" podCreationTimestamp="2025-11-22 02:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.067226813 +0000 UTC m=+146.373244106" watchObservedRunningTime="2025-11-22 02:56:22.071182465 +0000 UTC m=+146.377199738" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.079221 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.079790 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.579768195 +0000 UTC m=+146.885785468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.106762 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kdk4" podStartSLOduration=125.106743388 podStartE2EDuration="2m5.106743388s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.104350666 +0000 UTC m=+146.410367939" watchObservedRunningTime="2025-11-22 02:56:22.106743388 +0000 UTC m=+146.412760661" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.113482 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.114001 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.146222 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" event={"ID":"ef466a9b-34cc-4282-ad35-96731b58b8c3","Type":"ContainerStarted","Data":"2f0571924aa768f846198700072218e08441e4eae7d9e80411713fe07b738b7f"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.171861 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" event={"ID":"8a0cf8dd-fc29-442b-9ff2-7360946df755","Type":"ContainerStarted","Data":"edcc9b378248abedd6e327e1aa3bd37df8f39f45aceca37fd16cc54a7100a02c"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.178595 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" event={"ID":"85eed36b-26fa-4c39-a899-23262c4c1043","Type":"ContainerStarted","Data":"f42fa3ca570122171b0b90e8c28993d66894d1c069c45e0855cee37f20c0ab0a"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.180792 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 
02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.182355 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.682343588 +0000 UTC m=+146.988360851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.194994 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m2hvn" event={"ID":"c1faf425-0d47-41a5-8727-7f8f5d74f8b9","Type":"ContainerStarted","Data":"27b3a9348d9238f2fc5cacc77ad01c47521d31042ea13cab12e2d12da8401451"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.210690 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8qgqp" podStartSLOduration=125.210659224 podStartE2EDuration="2m5.210659224s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.199970649 +0000 UTC m=+146.505987922" watchObservedRunningTime="2025-11-22 02:56:22.210659224 +0000 UTC m=+146.516676517" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.245752 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" event={"ID":"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2","Type":"ContainerStarted","Data":"aad6efaa9f9ab236d9a3b8ad6f34ec1dac3c9617e3be635cf6a9ba0b14ea206d"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.286978 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.289361 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.789328862 +0000 UTC m=+147.095346295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.310663 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" event={"ID":"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb","Type":"ContainerStarted","Data":"80c62495a18d69e9ac1d4e770632cb5979f0f7250b4f5314558c4e4465783306"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.310720 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" event={"ID":"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb","Type":"ContainerStarted","Data":"ab86f2eeb807d0e33b741ba76c1534159c4e451e42437a043a26c67084193b34"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.359836 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" podStartSLOduration=125.359812681 podStartE2EDuration="2m5.359812681s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.354029472 +0000 UTC m=+146.660046745" watchObservedRunningTime="2025-11-22 02:56:22.359812681 +0000 UTC m=+146.665829954" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.360739 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sjc2j" podStartSLOduration=125.360731534 podStartE2EDuration="2m5.360731534s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.28494934 +0000 UTC m=+146.590966753" watchObservedRunningTime="2025-11-22 02:56:22.360731534 +0000 UTC m=+146.666748807" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.372001 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" event={"ID":"6630a2b1-f78c-4c1e-8511-6bb8dc615362","Type":"ContainerStarted","Data":"578e30ad1f000209b2ab51101586ea227a806a92e5f2da941650129d92744c5c"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.373109 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.386631 4952 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2c9gx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.386690 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" podUID="6630a2b1-f78c-4c1e-8511-6bb8dc615362" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.389199 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.390752 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.890734944 +0000 UTC m=+147.196752217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.490158 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.490380 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.990356991 +0000 UTC m=+147.296374264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.492171 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.493031 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:22.993022369 +0000 UTC m=+147.299039642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.551110 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89rxq" event={"ID":"aae47c6e-1d61-40ec-851a-c3e5a6242dcc","Type":"ContainerStarted","Data":"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.594378 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.594834 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" event={"ID":"a4782fdd-4348-4995-af8f-eb6d61183dec","Type":"ContainerStarted","Data":"a149c6fe083490a89b37ae26c1c09efc8a5e0aa18beb70041dc7286b1504083d"} Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.595070 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.095043036 +0000 UTC m=+147.401060309 (durationBeforeRetry 500ms). 
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.615192 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" podStartSLOduration=125.615174193 podStartE2EDuration="2m5.615174193s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.614505725 +0000 UTC m=+146.920523008" watchObservedRunningTime="2025-11-22 02:56:22.615174193 +0000 UTC m=+146.921191466"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.616033 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" podStartSLOduration=125.616026594 podStartE2EDuration="2m5.616026594s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.488357359 +0000 UTC m=+146.794374642" watchObservedRunningTime="2025-11-22 02:56:22.616026594 +0000 UTC m=+146.922043857"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.618914 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" event={"ID":"a57045cb-1c86-432e-ae6f-2f973ce52596","Type":"ContainerStarted","Data":"b0a365731aaf169b285dc4a5f2194f27c5565485e618eebf88eb193d6d314e3e"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.625422 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fhpzk" event={"ID":"867a3e20-6f2f-4dfe-a378-e6357d6c19e3","Type":"ContainerStarted","Data":"280ab8315b89c5049c1d38b6a4b847b9bcd4d688924dc893bdace42cad40ea43"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.635250 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" event={"ID":"4c07437b-bd0d-4569-b4a4-ff08b56f4a23","Type":"ContainerStarted","Data":"cc0c8617b28104736a1a17c63bc66105ce5a00d48d6ebdff5b31d35c8cabf690"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.635302 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" event={"ID":"4c07437b-bd0d-4569-b4a4-ff08b56f4a23","Type":"ContainerStarted","Data":"856219422f1ab66258e2f1ba84b28779aecb2f9bac338b2473ac541bde198f55"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.636389 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.643081 4952 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fch9k container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body=
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.643210 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" podUID="4c07437b-bd0d-4569-b4a4-ff08b56f4a23" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.664874 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" event={"ID":"85b3681c-313d-40d1-b1f9-c8410c81dc20","Type":"ContainerStarted","Data":"5de63b06bcb96d31a642cca322877f5dc11ac3eeff4102c405c4a22def21174b"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.666010 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.673835 4952 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9g9km container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.673919 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.682836 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" event={"ID":"ec0d0a12-4c75-4177-9e76-26baecebbf14","Type":"ContainerStarted","Data":"e36a14e4bba3b586a62529a4e510f8a9668007359a10aa6f27735916c2193cb3"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.682904 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" event={"ID":"ec0d0a12-4c75-4177-9e76-26baecebbf14","Type":"ContainerStarted","Data":"0e4a5c2373423320f747ee18aa199e8e6ecf51ee4ccc8dc5da16744e6d8c0a41"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.696744 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.698934 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.198917442 +0000 UTC m=+147.504934705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.719944 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" podStartSLOduration=125.719919681 podStartE2EDuration="2m5.719919681s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.717387736 +0000 UTC m=+147.023405009" watchObservedRunningTime="2025-11-22 02:56:22.719919681 +0000 UTC m=+147.025936954"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.735940 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" event={"ID":"2f5ae1d2-b361-49e2-9460-b447f70a4cd3","Type":"ContainerStarted","Data":"cff6e588889056d22ee310b2769f940a47ed8736cafbccf6f9780868ef8d8a9d"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.758949 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" event={"ID":"0893cff8-0528-4b35-b1f9-faa91e42e5a5","Type":"ContainerStarted","Data":"3ec9d91094b1829c4cfcb43c411535205b46b54c29268af58b38fda913405581"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.785647 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kkdb8" event={"ID":"af11e3ed-3c58-4ad5-9da7-38b9950ff726","Type":"ContainerStarted","Data":"a4ef57440a012753d651a16d44204deffae69a5c7ce4c343117d8a9518188756"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.790761 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.790805 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.808114 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.808504 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.308478543 +0000 UTC m=+147.614495816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.809743 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.810147 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.310138795 +0000 UTC m=+147.616156068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.828764 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" event={"ID":"7f15cd27-e587-4db6-8fcd-b5b2cd559656","Type":"ContainerStarted","Data":"36706f179051c96251759c5e938656e636087ebb5039f02fe8dca4da36351274"}
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.829852 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.870433 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-89rxq" podStartSLOduration=125.870413912 podStartE2EDuration="2m5.870413912s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.77992447 +0000 UTC m=+147.085941753" watchObservedRunningTime="2025-11-22 02:56:22.870413912 +0000 UTC m=+147.176431185"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.871915 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-65gk2" podStartSLOduration=125.8719089 podStartE2EDuration="2m5.8719089s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.870063393 +0000 UTC m=+147.176080676" watchObservedRunningTime="2025-11-22 02:56:22.8719089 +0000 UTC m=+147.177926173"
Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.884429 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" event={"ID":"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed","Type":"ContainerStarted","Data":"3ab457378ef65d5be3067f6659a6928c2311274b886d790f59aecec49fdc7afb"}
pod" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" event={"ID":"7c7c8cee-9a36-4dad-9a04-3c30fe4a7bed","Type":"ContainerStarted","Data":"3ab457378ef65d5be3067f6659a6928c2311274b886d790f59aecec49fdc7afb"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.919955 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:22 crc kubenswrapper[4952]: E1122 02:56:22.924656 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.424622693 +0000 UTC m=+147.730639966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.938681 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" event={"ID":"e043e178-e7e5-4ddf-b561-7253433d6e81","Type":"ContainerStarted","Data":"1e87078bfb59684dc1751a2a1047a8f5e5db9781bdaf8d3c9d55be14c66024aa"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.942445 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rlp29" podStartSLOduration=125.942425139 podStartE2EDuration="2m5.942425139s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:22.939523324 +0000 UTC m=+147.245540617" watchObservedRunningTime="2025-11-22 02:56:22.942425139 +0000 UTC m=+147.248442412" Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.982782 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" event={"ID":"6f481093-b909-4014-a103-2a655003e144","Type":"ContainerStarted","Data":"c283ba640e5dccdab3bbc24b8789eae659722dd018bf59418f15c949421dee27"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.982843 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" event={"ID":"6f481093-b909-4014-a103-2a655003e144","Type":"ContainerStarted","Data":"eece838e05b6098f780407cc7befc5ea42663e83d27b5305a19d80db79d69073"} Nov 22 02:56:22 crc kubenswrapper[4952]: I1122 02:56:22.991509 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" podStartSLOduration=125.991490399 podStartE2EDuration="2m5.991490399s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.022904 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-94k47" event={"ID":"028b56f2-cfad-4db5-81ab-fa866f42f9c3","Type":"ContainerStarted","Data":"b52bb8bac45fb8e417bbc01c44872d4264858c3762807127773642ea78caf167"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.024279 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.025703 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.525688036 +0000 UTC m=+147.831705309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.042228 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" podStartSLOduration=126.04220678 podStartE2EDuration="2m6.04220678s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.041202674 +0000 UTC m=+147.347219947" watchObservedRunningTime="2025-11-22 02:56:23.04220678 +0000 UTC m=+147.348224053"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.052775 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 02:56:23 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld
Nov 22 02:56:23 crc kubenswrapper[4952]: [+]process-running ok
Nov 22 02:56:23 crc kubenswrapper[4952]: healthz check failed
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.053018 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.062525 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" event={"ID":"0449fcf2-652a-4be5-957e-ecf47ef86668","Type":"ContainerStarted","Data":"b415d0ce4f0afdc997f582b09f425d4eee6c2bf339f02d95978331cfa4c9de43"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.062827 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" event={"ID":"0449fcf2-652a-4be5-957e-ecf47ef86668","Type":"ContainerStarted","Data":"2b82371aefe55530a4d85be7bf90023cca1d053bc618e907b389c40442205cbc"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.101832 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" event={"ID":"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9","Type":"ContainerStarted","Data":"5452dcf85375683fa995d72a4b35f89bfc340de090b38d1de1d7f2bbae0ae075"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.130310 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.130777 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.630732791 +0000 UTC m=+147.936750094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.135010 4952 generic.go:334] "Generic (PLEG): container finished" podID="bc8974e3-11f8-4820-ac5b-d70b337ecd4c" containerID="474e584d13bf4ddc8be87ad12fe72e4e860ff2f48fbffdee43f43e7d46e5f8ff" exitCode=0
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.135473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" event={"ID":"bc8974e3-11f8-4820-ac5b-d70b337ecd4c","Type":"ContainerDied","Data":"474e584d13bf4ddc8be87ad12fe72e4e860ff2f48fbffdee43f43e7d46e5f8ff"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.135534 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" event={"ID":"bc8974e3-11f8-4820-ac5b-d70b337ecd4c","Type":"ContainerStarted","Data":"b836294938aabb8d06ba1f830c1e3feb0c68d62fb581b31532f2f94298c2c387"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.135729 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.139014 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" event={"ID":"aee2ceff-d701-49e2-9dca-8edb0bd1d59e","Type":"ContainerStarted","Data":"cf3816fddec28e2476caad23cf759efda15608a9a691483371b49855f955190d"}
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.159899 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vg8j9"
(probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vg8j9" Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.159963 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.171071 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.186476 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zmfcx" Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.211033 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mz7ld" podStartSLOduration=126.21101233 podStartE2EDuration="2m6.21101233s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.112589995 +0000 UTC m=+147.418607288" watchObservedRunningTime="2025-11-22 02:56:23.21101233 +0000 UTC m=+147.517029603" Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.232944 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.237640 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.737619203 +0000 UTC m=+148.043636486 (durationBeforeRetry 500ms). 
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.284448 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9jtzv" podStartSLOduration=126.284422445 podStartE2EDuration="2m6.284422445s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.221224613 +0000 UTC m=+147.527241886" watchObservedRunningTime="2025-11-22 02:56:23.284422445 +0000 UTC m=+147.590439718"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.285813 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" podStartSLOduration=126.28580831 podStartE2EDuration="2m6.28580831s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.284870656 +0000 UTC m=+147.590887929" watchObservedRunningTime="2025-11-22 02:56:23.28580831 +0000 UTC m=+147.591825583"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.335345 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.335789 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.835758151 +0000 UTC m=+148.141775424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.392523 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2ggzk" podStartSLOduration=126.392505537 podStartE2EDuration="2m6.392505537s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.317887673 +0000 UTC m=+147.623904946" watchObservedRunningTime="2025-11-22 02:56:23.392505537 +0000 UTC m=+147.698522810"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.437222 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.438574 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:23.938557989 +0000 UTC m=+148.244575262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.526686 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" podStartSLOduration=126.52666993 podStartE2EDuration="2m6.52666993s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.524108564 +0000 UTC m=+147.830125827" watchObservedRunningTime="2025-11-22 02:56:23.52666993 +0000 UTC m=+147.832687203"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.539916 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.540361 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.040342641 +0000 UTC m=+148.346359904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.585470 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m5hc7" podStartSLOduration=126.585445178 podStartE2EDuration="2m6.585445178s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.582844561 +0000 UTC m=+147.888861834" watchObservedRunningTime="2025-11-22 02:56:23.585445178 +0000 UTC m=+147.891462451"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.642234 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.642956 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.142913802 +0000 UTC m=+148.448931075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.746306 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.747325 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.24730276 +0000 UTC m=+148.553320033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.756727 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jgbvn" podStartSLOduration=126.756707432 podStartE2EDuration="2m6.756707432s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:23.620011494 +0000 UTC m=+147.926028767" watchObservedRunningTime="2025-11-22 02:56:23.756707432 +0000 UTC m=+148.062724705"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.849635 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.850047 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.350033337 +0000 UTC m=+148.656050610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.861701 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlwls"]
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.863017 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.895138 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.930503 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlwls"]
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.950579 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.950912 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.951000 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.951050 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbq5\" (UniqueName: \"kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:23 crc kubenswrapper[4952]: E1122 02:56:23.951179 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.451156991 +0000 UTC m=+148.757174264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.986452 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"]
Nov 22 02:56:23 crc kubenswrapper[4952]: I1122 02:56:23.987869 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.016164 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.042560 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 02:56:24 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]process-running ok
Nov 22 02:56:24 crc kubenswrapper[4952]: healthz check failed
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.042628 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.052508 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbq5\" (UniqueName: \"kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.052609 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.052674 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.052706 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.053065 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.553049686 +0000 UTC m=+148.859066959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.053943 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.054004 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.112589 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbq5\" (UniqueName: \"kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5\") pod \"community-operators-hlwls\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") " pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.142243 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"]
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.154830 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.155131 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.155163 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dphj6\" (UniqueName: \"kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.155222 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.155331 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.655312149 +0000 UTC m=+148.961329422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.655312149 +0000 UTC m=+148.961329422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.207901 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" event={"ID":"a57045cb-1c86-432e-ae6f-2f973ce52596","Type":"ContainerStarted","Data":"c10cd91518435994adfd4a7709fb08d064b276882f1b0b87348e5142658f2aaa"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.207963 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" event={"ID":"a57045cb-1c86-432e-ae6f-2f973ce52596","Type":"ContainerStarted","Data":"cc959c38eafade187eda751dc818213b3428794960b484046df0a308e277731f"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.216616 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f95sp"] Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.217813 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.221497 4952 util.go:30] "No sandbox for pod can be found. 
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.269733 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.269778 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dphj6\" (UniqueName: \"kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.269817 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.269842 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.271011 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.271284 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.271824 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f95sp"]
Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.271873 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.771860159 +0000 UTC m=+149.077877432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.278009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7vg2x" event={"ID":"b46a19f6-7d04-44b1-a2ad-6146c66fb5e2","Type":"ContainerStarted","Data":"fbc124212e2214f7486887158a3b2fa23c7bde091fecf5b987629b4eedcac5f3"}
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.305684 4952 patch_prober.go:28] interesting pod/apiserver-76f77b778f-dfwdz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]log ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]etcd ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/start-apiserver-admission-initializer ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/generic-apiserver-start-informers ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/max-in-flight-filter ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/storage-object-count-tracker-hook ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Nov 22 02:56:24 crc kubenswrapper[4952]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/project.openshift.io-projectcache ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/openshift.io-startinformers ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/openshift.io-restmapperupdater ok
Nov 22 02:56:24 crc kubenswrapper[4952]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Nov 22 02:56:24 crc kubenswrapper[4952]: livez check failed
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.305779 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" podUID="b72acca1-8338-4e9c-9fa5-55616766e8a9" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.317648 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fhpzk" event={"ID":"867a3e20-6f2f-4dfe-a378-e6357d6c19e3","Type":"ContainerStarted","Data":"285017f174dad82f4eda079734f91a7bccec71ca124428bfdf5ec5a0ddad4964"}
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.317694 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fhpzk" event={"ID":"867a3e20-6f2f-4dfe-a378-e6357d6c19e3","Type":"ContainerStarted","Data":"7b4629389934c14ac4b8b3bd9a30ebdb13e26a46c2fba2af1c581831239d0c8a"}
Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.317897 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fhpzk"
pod="openshift-dns/dns-default-fhpzk" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.355908 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" event={"ID":"7f15cd27-e587-4db6-8fcd-b5b2cd559656","Type":"ContainerStarted","Data":"e0c8b7ba2f2a984d58bdfcb299d8780e8abe822452bc189402bad36db8500456"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.359328 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8j98h" podStartSLOduration=127.359295982 podStartE2EDuration="2m7.359295982s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.350518087 +0000 UTC m=+148.656535350" watchObservedRunningTime="2025-11-22 02:56:24.359295982 +0000 UTC m=+148.665313255" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.368787 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dphj6\" (UniqueName: \"kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6\") pod \"certified-operators-bwg2f\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") " pod="openshift-marketplace/certified-operators-bwg2f" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371325 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371699 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371751 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wqk\" (UniqueName: \"kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371827 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:24 crc 
kubenswrapper[4952]: I1122 02:56:24.371870 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371903 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.371934 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.372026 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.87200962 +0000 UTC m=+149.178026893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.373898 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.378318 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.380853 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.382748 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"] Nov 22 02:56:24 
crc kubenswrapper[4952]: I1122 02:56:24.384040 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.386157 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.406188 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"] Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.410672 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" event={"ID":"a420a904-6165-4f7a-a29e-3c5549e5cec5","Type":"ContainerStarted","Data":"275a0895a7f89e3b11add78559e4515fc88d9805368b4d2716ab0a8c745afd53"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.410730 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" event={"ID":"a420a904-6165-4f7a-a29e-3c5549e5cec5","Type":"ContainerStarted","Data":"e07720606d9afe8d6ad892500d537e9ab758b3fb516d98f0d655579091a0d1d1"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.447644 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.448349 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" event={"ID":"0893cff8-0528-4b35-b1f9-faa91e42e5a5","Type":"ContainerStarted","Data":"65add73c435c907dd1cfdcbed4aaaee8197effcc8c62663312d6c576333cf0ed"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.448383 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" event={"ID":"0893cff8-0528-4b35-b1f9-faa91e42e5a5","Type":"ContainerStarted","Data":"4982b07c57d8f8c746a63290d7811017f8a0cb5b15b7812c83333de1ef8e490b"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.449644 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fhpzk" podStartSLOduration=8.44962056 podStartE2EDuration="8.44962056s" podCreationTimestamp="2025-11-22 02:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.447178417 +0000 UTC m=+148.753195700" watchObservedRunningTime="2025-11-22 02:56:24.44962056 +0000 UTC m=+148.755637833" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.461972 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.478446 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97wqk\" (UniqueName: \"kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.478903 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.478959 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.479002 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.481137 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:24.981125588 +0000 UTC m=+149.287142861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.481628 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.481990 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.491105 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" event={"ID":"7f436461-6865-411d-9c2d-8c5794d1b4ab","Type":"ContainerStarted","Data":"0015824168b1ca54ffde80a63074961e60d12362f1816e71577f225e54e8496f"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.568489 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hcvgc" event={"ID":"a4782fdd-4348-4995-af8f-eb6d61183dec","Type":"ContainerStarted","Data":"eb9e0b68e1466965f3e33a7ffafafb3baef94484784c9d311d009aa684a428f4"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.581169 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.581423 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.581575 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.581624 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxz4\" (UniqueName: \"kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 
02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.582284 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.082269734 +0000 UTC m=+149.388287007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.586079 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" event={"ID":"85b3681c-313d-40d1-b1f9-c8410c81dc20","Type":"ContainerStarted","Data":"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.587037 4952 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9g9km container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.587061 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.596518 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wqk\" (UniqueName: \"kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk\") pod \"community-operators-f95sp\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") " pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.615432 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" event={"ID":"85eed36b-26fa-4c39-a899-23262c4c1043","Type":"ContainerStarted","Data":"dbe1ef6d4ca769c5aac8e9acdbb2548c678ff385625b4c13eca2c51cf8662c03"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.616423 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwg2f" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.633693 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" event={"ID":"6630a2b1-f78c-4c1e-8511-6bb8dc615362","Type":"ContainerStarted","Data":"74b22270dea05f92b3d2cf488380f62b70ebea3a1e6ffec8586b87fd44a9c833"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.637267 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8ngds" podStartSLOduration=127.637248874 podStartE2EDuration="2m7.637248874s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.610482658 +0000 UTC m=+148.916499931" watchObservedRunningTime="2025-11-22 02:56:24.637248874 +0000 UTC m=+148.943266147" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.657746 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.697654 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.716907 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.717256 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t4rf8" podStartSLOduration=127.717234496 podStartE2EDuration="2m7.717234496s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.635392487 +0000 UTC m=+148.941409760" watchObservedRunningTime="2025-11-22 02:56:24.717234496 +0000 UTC m=+149.023251769" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.719193 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.719297 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxz4\" (UniqueName: \"kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.719314 4952 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.21929192 +0000 UTC m=+149.525309193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.720388 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" podStartSLOduration=127.720379667 podStartE2EDuration="2m7.720379667s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.717726879 +0000 UTC m=+149.023744162" watchObservedRunningTime="2025-11-22 02:56:24.720379667 +0000 UTC m=+149.026396950" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.721292 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.736699 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.742956 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2c9gx" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.752323 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" event={"ID":"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9","Type":"ContainerStarted","Data":"48e2a1780b85bf9ada43db4603a5f2ca4b4e596e13b7ccaa30ea4de9fcee5121"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.752647 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" event={"ID":"9aae38d4-efc5-4f2b-acdf-0d3a607b54a9","Type":"ContainerStarted","Data":"3bdfda9c0a2ab73c14c4d9f4e166061d55bece96fb4b47f31101ec77a9cc5a3b"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.762434 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nbpml" podStartSLOduration=127.762416825 podStartE2EDuration="2m7.762416825s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.752632394 +0000 UTC m=+149.058649667" watchObservedRunningTime="2025-11-22 
02:56:24.762416825 +0000 UTC m=+149.068434098" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.813324 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5gwx5" podStartSLOduration=127.813302582 podStartE2EDuration="2m7.813302582s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.794010776 +0000 UTC m=+149.100028039" watchObservedRunningTime="2025-11-22 02:56:24.813302582 +0000 UTC m=+149.119319855" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.818703 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxz4\" (UniqueName: \"kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4\") pod \"certified-operators-bsnxq\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") " pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.827353 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-94k47" event={"ID":"028b56f2-cfad-4db5-81ab-fa866f42f9c3","Type":"ContainerStarted","Data":"8f1ff188e34be15939e1ecd442e7393e6053206242f16e3a83960d489a25e900"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.827980 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.829248 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.3292277 +0000 UTC m=+149.635244973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.845370 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f95sp" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.861239 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlwls"] Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.899166 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m2hvn" event={"ID":"c1faf425-0d47-41a5-8727-7f8f5d74f8b9","Type":"ContainerStarted","Data":"d9442af5e81e4cb491c6f463c2e290acf2c8332666aa017c73135b6fb24d2afe"} Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.900450 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.900499 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.932437 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:24 crc kubenswrapper[4952]: E1122 02:56:24.940920 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.440901115 +0000 UTC m=+149.746918388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:24 crc kubenswrapper[4952]: I1122 02:56:24.950141 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m2hvn" podStartSLOduration=8.950119191 podStartE2EDuration="8.950119191s" podCreationTimestamp="2025-11-22 02:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:24.93523335 +0000 UTC m=+149.241250613" watchObservedRunningTime="2025-11-22 02:56:24.950119191 +0000 UTC m=+149.256136464" Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.016100 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fch9k" Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.022888 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bsnxq" Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.036975 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.037101 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.537080664 +0000 UTC m=+149.843097937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.037381 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.045719 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.54553271 +0000 UTC m=+149.851550193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.063782 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:25 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:25 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:25 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.063844 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.146309 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.146663 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.646647834 +0000 UTC m=+149.952665107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.203212 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4vlh5" Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.250245 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.250623 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.750608902 +0000 UTC m=+150.056626175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.353217 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.353494 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.85347714 +0000 UTC m=+150.159494413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.454593 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.455023 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:25.955008226 +0000 UTC m=+150.261025499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.556606 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.556830 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.056798878 +0000 UTC m=+150.362816151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.557150 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.557489 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.057474495 +0000 UTC m=+150.363491768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.648162 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"] Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.661042 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.661522 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.161504234 +0000 UTC m=+150.467521507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.766696 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.767172 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.267155835 +0000 UTC m=+150.573173108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.869163 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.869552 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.369521911 +0000 UTC m=+150.675539184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.934596 4952 generic.go:334] "Generic (PLEG): container finished" podID="ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" containerID="80c62495a18d69e9ac1d4e770632cb5979f0f7250b4f5314558c4e4465783306" exitCode=0 Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.935055 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" event={"ID":"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb","Type":"ContainerDied","Data":"80c62495a18d69e9ac1d4e770632cb5979f0f7250b4f5314558c4e4465783306"} Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.946778 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"74edaee1a34bed4bf4b8066c2e9d812465f4295520d8251c249244e0c375270e"} Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.975596 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:25 crc kubenswrapper[4952]: E1122 02:56:25.975911 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.475897781 +0000 UTC m=+150.781915054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.992338 4952 generic.go:334] "Generic (PLEG): container finished" podID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerID="073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0" exitCode=0 Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.992436 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerDied","Data":"073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0"} Nov 22 02:56:25 crc kubenswrapper[4952]: I1122 02:56:25.992466 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerStarted","Data":"448f979b5a1a009f4388d263945ec4f19c03e5ba3055c9dc14242630156a4ee8"} Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.006392 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerStarted","Data":"7176f1aee2d9db649f4bc38e52926e570ce0edf6ae8a2d42a0e8871f84725d57"} Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.010153 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c65b5bc0f41fd7fe77a7668a3451a0fe97ac22caff7d5f78b0aa1ae7ce1ee525"} Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.014972 4952 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9g9km container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.015036 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.050616 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:26 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:26 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:26 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.050703 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.077527 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.078981 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.578953955 +0000 UTC m=+150.884971228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.093427 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f95sp"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.149773 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.151362 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.160250 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.171663 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.183982 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7ql6\" (UniqueName: \"kubernetes.io/projected/fc00999c-0e40-4bca-b54a-2d416d925514-kube-api-access-c7ql6\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.184031 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.184065 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.184203 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.184605 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.684590225 +0000 UTC m=+150.990607498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.184651 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.286635 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.286823 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7ql6\" (UniqueName: \"kubernetes.io/projected/fc00999c-0e40-4bca-b54a-2d416d925514-kube-api-access-c7ql6\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.286856 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.286895 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.287328 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.287409 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.787390482 +0000 UTC m=+151.093407755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.288632 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.365718 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7ql6\" (UniqueName: \"kubernetes.io/projected/fc00999c-0e40-4bca-b54a-2d416d925514-kube-api-access-c7ql6\") pod \"redhat-marketplace-lklxp\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") " pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.375888 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.376906 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.387731 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.388131 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.888118557 +0000 UTC m=+151.194135830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.463470 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.489728 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.489954 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.490025 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.490063 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnpv\" (UniqueName: \"kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.490178 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:26.990155875 +0000 UTC m=+151.296173148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.519081 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.593756 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.594247 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnpv\" (UniqueName: \"kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.594292 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.594324 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.594708 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.094690087 +0000 UTC m=+151.400707350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.594858 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.595067 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.648506 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnpv\" (UniqueName: \"kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv\") pod \"redhat-marketplace-r22df\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") " pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.695717 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.696161 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.19613447 +0000 UTC m=+151.502151743 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.707856 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r22df" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.798112 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.798537 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.298522108 +0000 UTC m=+151.604539381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.899678 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:26 crc kubenswrapper[4952]: E1122 02:56:26.900439 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.400417891 +0000 UTC m=+151.706435164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.954993 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"] Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.961508 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.966376 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 02:56:26 crc kubenswrapper[4952]: I1122 02:56:26.980225 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.001776 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.001862 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w85x\" (UniqueName: \"kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.003173 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.003209 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.003638 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.50362088 +0000 UTC m=+151.809638153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.034082 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.035095 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.036091 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4cd5472a862ffe37255e984b882145e614fb1b467c8c96e635870a1c5aa41aa9"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.038693 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.038836 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.045371 4952 generic.go:334] "Generic (PLEG): container finished" podID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerID="4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7" exitCode=0 Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.045480 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerDied","Data":"4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.045507 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerStarted","Data":"8c5e33a759d0b7a564075144bf9469850bc6812d1d6c32b8df6bc503c600bc2b"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.048268 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.072524 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7d795ef213600883e33208485fcd968e8f1937ac6f8ced2438af869d72f85a8b"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.072607 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ea34b116c1cbdd13e56c6e6ddfb56963bcb53b69bc0c0e5c2ccec105540315d5"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.080497 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:27 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:27 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:27 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.080592 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.086260 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"601077ccbfc8e1baba4f70caec4b784967dfee0ff6582913c848e5d5a5836114"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.089372 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.093279 4952 generic.go:334] "Generic (PLEG): container finished" podID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerID="ab66888561f1f4662da6fa9f55629d4a9d6d195c38c8abe59565e42e7e61161c" exitCode=0 Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.093351 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerDied","Data":"ab66888561f1f4662da6fa9f55629d4a9d6d195c38c8abe59565e42e7e61161c"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.093379 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerStarted","Data":"f9cc49b212c5c448cb52a607488d8760c951d9d6a1116f71192c3d613c1f210f"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107354 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.107570 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.607491655 +0000 UTC m=+151.913508928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107688 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w85x\" (UniqueName: \"kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107775 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107805 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107872 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.107897 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.109670 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.60965753 +0000 UTC m=+151.915674813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.110464 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.110801 4952 generic.go:334] "Generic (PLEG): container finished" podID="d2d87343-7102-459c-a231-294939870dc5" containerID="a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597" exitCode=0 Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.112410 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerDied","Data":"a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597"} Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.112741 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.119097 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.140807 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.168622 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-dfwdz" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.178186 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.207043 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w85x\" (UniqueName: \"kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x\") pod \"redhat-operators-kqxwp\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") " pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.210747 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.212971 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access\") 
pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.213523 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.214427 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.714401188 +0000 UTC m=+152.020418511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.219381 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.264791 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.297336 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.315119 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.315667 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.815460661 +0000 UTC m=+152.121477934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.337438 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqxwp" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.345916 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.347844 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.369430 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.392903 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.416614 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.416953 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.416984 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.417017 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xz7h\" (UniqueName: \"kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.417187 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:27.91717038 +0000 UTC m=+152.223187653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.519188 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.019164157 +0000 UTC m=+152.325181430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.518480 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.520581 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.520615 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.520652 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xz7h\" (UniqueName: \"kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.521587 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.521836 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content\") pod \"redhat-operators-bcfhk\" (UID: 
\"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.548921 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xz7h\" (UniqueName: \"kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h\") pod \"redhat-operators-bcfhk\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.560295 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.621666 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume\") pod \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.621747 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume\") pod \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.621784 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9kjf\" (UniqueName: \"kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf\") pod \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\" (UID: \"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb\") " Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.621880 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.622115 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.122095438 +0000 UTC m=+152.428112711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.624121 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" (UID: "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.647022 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" (UID: "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.651443 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf" (OuterVolumeSpecName: "kube-api-access-r9kjf") pod "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" (UID: "ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb"). InnerVolumeSpecName "kube-api-access-r9kjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.724792 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.725349 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.725363 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.725376 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9kjf\" (UniqueName: \"kubernetes.io/projected/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb-kube-api-access-r9kjf\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.725787 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.225770158 +0000 UTC m=+152.531787431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.768496 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.826840 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.827052 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.327021056 +0000 UTC m=+152.633038329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.827246 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.827624 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.327617322 +0000 UTC m=+152.633634595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.910333 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"] Nov 22 02:56:27 crc kubenswrapper[4952]: I1122 02:56:27.928172 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:27 crc kubenswrapper[4952]: E1122 02:56:27.928685 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.428667604 +0000 UTC m=+152.734684877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.030180 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.030860 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.530848396 +0000 UTC m=+152.836865669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.046481 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:28 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:28 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:28 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.046531 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.122027 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.131373 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.131861 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.631839027 +0000 UTC m=+152.937856300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.146161 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerStarted","Data":"b5a42f9ea9fe4d8b638cf5b4706917d141617424acbbf84be2c6ca8b00a2ca8d"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.149851 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-94k47" event={"ID":"028b56f2-cfad-4db5-81ab-fa866f42f9c3","Type":"ContainerStarted","Data":"7053a67c98dd6bab476cdc58cd45456567ccde5e8c826892734f7f67d2ccd7ed"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.149927 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-94k47" event={"ID":"028b56f2-cfad-4db5-81ab-fa866f42f9c3","Type":"ContainerStarted","Data":"1aaeca20f6565c3aadba03ac773eaccc7aa429bc2dbe78264fcac6bc3c6fd2f4"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.159719 4952 generic.go:334] "Generic (PLEG): container finished" podID="fc00999c-0e40-4bca-b54a-2d416d925514" containerID="420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d" exitCode=0 Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.159840 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerDied","Data":"420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.159891 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerStarted","Data":"81db02489d5dace86350433651e39c4fce6c5080b1e2f4f5206cf2d23c880720"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.163805 4952 generic.go:334] "Generic (PLEG): container finished" podID="68fb4fd4-561b-4208-9370-331e71740744" containerID="8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e" exitCode=0 Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.164728 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerDied","Data":"8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.164762 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerStarted","Data":"d0d8767f6128e122c8505e092346c1c791f7489c922b6855452cbf5c909c3a3a"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.171218 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.174386 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq" event={"ID":"ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb","Type":"ContainerDied","Data":"ab86f2eeb807d0e33b741ba76c1534159c4e451e42437a043a26c67084193b34"} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.174445 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab86f2eeb807d0e33b741ba76c1534159c4e451e42437a043a26c67084193b34" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.206314 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.238198 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.246473 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.746456528 +0000 UTC m=+153.052473801 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.340091 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.340333 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.840307056 +0000 UTC m=+153.146324329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.340595 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.341281 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.841248349 +0000 UTC m=+153.147265622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.341972 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.342054 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.375822 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.398657 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.399620 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.428354 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.447490 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.447801 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.947757032 +0000 UTC m=+153.253774305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.448079 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.449202 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:28.949172189 +0000 UTC m=+153.255189462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.458133 4952 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.551310 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.551784 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:56:29.051763031 +0000 UTC m=+153.357780304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.586046 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.586124 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.588299 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.588378 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.653236 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: E1122 02:56:28.653768 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:56:29.153750668 +0000 UTC m=+153.459767941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p9bf7" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.728820 4952 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T02:56:28.45818022Z","Handler":null,"Name":""} Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.739307 4952 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.739385 4952 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.754076 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.763097 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.858284 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.867632 4952 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
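
The exchange above is a CSI registration race resolving itself: every MountVolume.MountDevice and UnmountVolume.TearDown attempt against pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" and is requeued with durationBeforeRetry 500ms, until the plugin watcher notices /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, csi_plugin validates and registers the driver, and the next retry succeeds; MountDevice itself is then a no-op because the driver does not advertise the STAGE_UNSTAGE_VOLUME capability. Below is a minimal sketch of that retry-until-registered pattern; driverRegistry and mountDevice are hypothetical stand-ins for the kubelet's internals, not its actual API.

```go
// Sketch only: models the pattern in the log above, where volume operations
// fail until the CSI driver registers over its plugin socket. All names here
// (driverRegistry, mountDevice) are illustrative, not kubelet code.
package main

import (
	"fmt"
	"sync"
	"time"
)

// driverRegistry stands in for the kubelet's list of registered CSI drivers.
type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]bool
}

func (r *driverRegistry) register(name string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = true
}

func (r *driverRegistry) has(name string) bool {
	r.mu.RLock()
	defer r.mu.RUnlock()
	return r.drivers[name]
}

// mountDevice fails the same way the log does while the driver is unknown.
func mountDevice(reg *driverRegistry, driver string) error {
	if !reg.has(driver) {
		return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", driver)
	}
	// Driver known; with no STAGE_UNSTAGE_VOLUME capability there is nothing
	// to stage, matching "Skipping MountDevice..." above.
	return nil
}

func main() {
	reg := &driverRegistry{drivers: map[string]bool{}}
	const driver = "kubevirt.io.hostpath-provisioner"

	// Simulate the plugin watcher registering the driver a moment later,
	// like the plugins_registry socket event at 02:56:28.458.
	go func() {
		time.Sleep(1200 * time.Millisecond)
		reg.register(driver)
	}()

	// Requeue failed operations, mirroring the 500ms durationBeforeRetry.
	for {
		if err := mountDevice(reg, driver); err != nil {
			fmt.Println("no retries permitted for 500ms:", err)
			time.Sleep(500 * time.Millisecond)
			continue
		}
		fmt.Println("MountVolume.MountDevice succeeded")
		return
	}
}
```

The fixed 500ms delay matches the durationBeforeRetry values logged here; the real kubelet backs off exponentially for operations that keep failing, so the constant is a simplification.
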
Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.867711 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.918006 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.918070 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.922373 4952 patch_prober.go:28] interesting pod/console-f9d7485db-89rxq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.922476 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-89rxq" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" Nov 22 02:56:28 crc kubenswrapper[4952]: I1122 02:56:28.930932 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p9bf7\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.038358 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.044207 4952 patch_prober.go:28] interesting pod/router-default-5444994796-9wz27 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:56:29 crc kubenswrapper[4952]: [-]has-synced failed: reason withheld Nov 22 02:56:29 crc kubenswrapper[4952]: [+]process-running ok Nov 22 02:56:29 crc kubenswrapper[4952]: healthz check failed Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.044291 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9wz27" podUID="12981135-5b52-464d-8690-e571eb306507" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.189861 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.192628 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-94k47" event={"ID":"028b56f2-cfad-4db5-81ab-fa866f42f9c3","Type":"ContainerStarted","Data":"1355292b57ae24166d1be16d62f2a89bf4c0835355af28b0e2d7c00bb596feba"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.206289 4952 generic.go:334] "Generic (PLEG): container finished" podID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerID="0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05" exitCode=0 Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.206376 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerDied","Data":"0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.206406 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerStarted","Data":"311a328228755158122e49518b2a5fc531a1482855b663b9f489a36977f0b341"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.211318 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90f95915-0e01-49c6-b016-0e776f102ec3","Type":"ContainerStarted","Data":"694df8664c2507e0614574f3c0d7e372714c84d5cef7c9f176f146b737d3ebfe"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.211400 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90f95915-0e01-49c6-b016-0e776f102ec3","Type":"ContainerStarted","Data":"91c3ef3ce397b13cb957e8bc70a32ab533474e5850c886ece401fb98769c5665"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.221614 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-94k47" podStartSLOduration=13.221587477 podStartE2EDuration="13.221587477s" podCreationTimestamp="2025-11-22 02:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:29.214151387 +0000 UTC m=+153.520168670" watchObservedRunningTime="2025-11-22 02:56:29.221587477 +0000 UTC m=+153.527604770" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.224245 4952 generic.go:334] "Generic (PLEG): container finished" podID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerID="1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba" exitCode=0 Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.225691 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerDied","Data":"1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba"} Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.237529 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4xhjg" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.270815 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.270788879 
podStartE2EDuration="3.270788879s" podCreationTimestamp="2025-11-22 02:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:29.269944128 +0000 UTC m=+153.575961391" watchObservedRunningTime="2025-11-22 02:56:29.270788879 +0000 UTC m=+153.576806152" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.421764 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" Nov 22 02:56:29 crc kubenswrapper[4952]: I1122 02:56:29.749516 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"] Nov 22 02:56:29 crc kubenswrapper[4952]: W1122 02:56:29.834875 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073b4a27_3e98_4d0d_a2b7_62a89a434907.slice/crio-e5d8ac7c529cf0bc4c8d5be6ce3ab26e5ec207e8493d63769d79730a1b686cb1 WatchSource:0}: Error finding container e5d8ac7c529cf0bc4c8d5be6ce3ab26e5ec207e8493d63769d79730a1b686cb1: Status 404 returned error can't find the container with id e5d8ac7c529cf0bc4c8d5be6ce3ab26e5ec207e8493d63769d79730a1b686cb1 Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.077664 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.093307 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9wz27" Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.261150 4952 generic.go:334] "Generic (PLEG): container finished" podID="90f95915-0e01-49c6-b016-0e776f102ec3" containerID="694df8664c2507e0614574f3c0d7e372714c84d5cef7c9f176f146b737d3ebfe" exitCode=0 Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.261901 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90f95915-0e01-49c6-b016-0e776f102ec3","Type":"ContainerDied","Data":"694df8664c2507e0614574f3c0d7e372714c84d5cef7c9f176f146b737d3ebfe"} Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.288783 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" event={"ID":"073b4a27-3e98-4d0d-a2b7-62a89a434907","Type":"ContainerStarted","Data":"0977207663e74354579e8bb1ed76de546ca01a79756275e3ae66ba31b92d4b13"} Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.288893 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" event={"ID":"073b4a27-3e98-4d0d-a2b7-62a89a434907","Type":"ContainerStarted","Data":"e5d8ac7c529cf0bc4c8d5be6ce3ab26e5ec207e8493d63769d79730a1b686cb1"} Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 02:56:30.310969 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" podStartSLOduration=133.310947087 podStartE2EDuration="2m13.310947087s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:30.308955387 +0000 UTC m=+154.614972680" watchObservedRunningTime="2025-11-22 02:56:30.310947087 +0000 UTC m=+154.616964360" Nov 22 02:56:30 crc kubenswrapper[4952]: I1122 
02:56:30.562114 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.298710 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.686357 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.738294 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access\") pod \"90f95915-0e01-49c6-b016-0e776f102ec3\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.738658 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir\") pod \"90f95915-0e01-49c6-b016-0e776f102ec3\" (UID: \"90f95915-0e01-49c6-b016-0e776f102ec3\") " Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.739511 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90f95915-0e01-49c6-b016-0e776f102ec3" (UID: "90f95915-0e01-49c6-b016-0e776f102ec3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.750759 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90f95915-0e01-49c6-b016-0e776f102ec3" (UID: "90f95915-0e01-49c6-b016-0e776f102ec3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.852006 4952 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90f95915-0e01-49c6-b016-0e776f102ec3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.852047 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90f95915-0e01-49c6-b016-0e776f102ec3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.874789 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:56:31 crc kubenswrapper[4952]: E1122 02:56:31.875176 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" containerName="collect-profiles" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.875198 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" containerName="collect-profiles" Nov 22 02:56:31 crc kubenswrapper[4952]: E1122 02:56:31.875228 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f95915-0e01-49c6-b016-0e776f102ec3" containerName="pruner" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.875237 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f95915-0e01-49c6-b016-0e776f102ec3" containerName="pruner" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.877294 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f95915-0e01-49c6-b016-0e776f102ec3" containerName="pruner" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.877360 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" containerName="collect-profiles" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.878156 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.893976 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.900396 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.900731 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.953938 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:31 crc kubenswrapper[4952]: I1122 02:56:31.954042 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.055264 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.055354 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.055856 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.122168 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.236398 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.309331 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90f95915-0e01-49c6-b016-0e776f102ec3","Type":"ContainerDied","Data":"91c3ef3ce397b13cb957e8bc70a32ab533474e5850c886ece401fb98769c5665"} Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.309404 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c3ef3ce397b13cb957e8bc70a32ab533474e5850c886ece401fb98769c5665" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.309576 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:56:32 crc kubenswrapper[4952]: I1122 02:56:32.522616 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:56:33 crc kubenswrapper[4952]: I1122 02:56:33.358101 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ac254f5-32f7-4cfd-a411-422a7398dc15","Type":"ContainerStarted","Data":"ad908deefc01c24ea0de558bad6f0061b254a194651bb4d58484569b18bc4a06"} Nov 22 02:56:34 crc kubenswrapper[4952]: I1122 02:56:34.380263 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ac254f5-32f7-4cfd-a411-422a7398dc15","Type":"ContainerStarted","Data":"4c8e4240a1b923f34896142b4735b81b50e37692a88b3a3ebaa499c09646ea61"} Nov 22 02:56:34 crc kubenswrapper[4952]: I1122 02:56:34.406561 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.40651416 podStartE2EDuration="3.40651416s" podCreationTimestamp="2025-11-22 02:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:34.398510505 +0000 UTC m=+158.704527778" watchObservedRunningTime="2025-11-22 02:56:34.40651416 +0000 UTC m=+158.712531433" Nov 22 02:56:34 crc kubenswrapper[4952]: I1122 02:56:34.476867 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fhpzk" Nov 22 02:56:35 crc kubenswrapper[4952]: I1122 02:56:35.409819 4952 generic.go:334] "Generic (PLEG): container finished" podID="7ac254f5-32f7-4cfd-a411-422a7398dc15" containerID="4c8e4240a1b923f34896142b4735b81b50e37692a88b3a3ebaa499c09646ea61" exitCode=0 Nov 22 02:56:35 crc kubenswrapper[4952]: I1122 02:56:35.409880 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ac254f5-32f7-4cfd-a411-422a7398dc15","Type":"ContainerDied","Data":"4c8e4240a1b923f34896142b4735b81b50e37692a88b3a3ebaa499c09646ea61"} Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.776159 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.861632 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir\") pod \"7ac254f5-32f7-4cfd-a411-422a7398dc15\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.862204 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access\") pod \"7ac254f5-32f7-4cfd-a411-422a7398dc15\" (UID: \"7ac254f5-32f7-4cfd-a411-422a7398dc15\") " Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.861756 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7ac254f5-32f7-4cfd-a411-422a7398dc15" (UID: "7ac254f5-32f7-4cfd-a411-422a7398dc15"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.862714 4952 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ac254f5-32f7-4cfd-a411-422a7398dc15-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.869185 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7ac254f5-32f7-4cfd-a411-422a7398dc15" (UID: "7ac254f5-32f7-4cfd-a411-422a7398dc15"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:56:36 crc kubenswrapper[4952]: I1122 02:56:36.964142 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ac254f5-32f7-4cfd-a411-422a7398dc15-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:37 crc kubenswrapper[4952]: I1122 02:56:37.449387 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ac254f5-32f7-4cfd-a411-422a7398dc15","Type":"ContainerDied","Data":"ad908deefc01c24ea0de558bad6f0061b254a194651bb4d58484569b18bc4a06"} Nov 22 02:56:37 crc kubenswrapper[4952]: I1122 02:56:37.449442 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad908deefc01c24ea0de558bad6f0061b254a194651bb4d58484569b18bc4a06" Nov 22 02:56:37 crc kubenswrapper[4952]: I1122 02:56:37.449451 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.586561 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.586701 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.587380 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.587456 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.922218 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:38 crc kubenswrapper[4952]: I1122 02:56:38.926026 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-89rxq" Nov 22 02:56:40 crc kubenswrapper[4952]: I1122 02:56:40.842678 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:40 crc kubenswrapper[4952]: I1122 02:56:40.851685 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc-metrics-certs\") pod \"network-metrics-daemon-gkngm\" (UID: \"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc\") " pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:40 crc kubenswrapper[4952]: I1122 02:56:40.870029 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gkngm" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.585929 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.587970 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.586096 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.588503 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.588694 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-kkdb8" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.589788 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.589881 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.590099 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"a4ef57440a012753d651a16d44204deffae69a5c7ce4c343117d8a9518188756"} pod="openshift-console/downloads-7954f5f757-kkdb8" containerMessage="Container download-server failed liveness probe, will be restarted" Nov 22 02:56:48 crc kubenswrapper[4952]: I1122 02:56:48.590349 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" containerID="cri-o://a4ef57440a012753d651a16d44204deffae69a5c7ce4c343117d8a9518188756" gracePeriod=2 Nov 22 02:56:49 crc kubenswrapper[4952]: I1122 02:56:49.195300 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 02:56:51 crc kubenswrapper[4952]: I1122 02:56:51.553591 4952 generic.go:334] "Generic (PLEG): container finished" podID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" 
containerID="a4ef57440a012753d651a16d44204deffae69a5c7ce4c343117d8a9518188756" exitCode=0 Nov 22 02:56:51 crc kubenswrapper[4952]: I1122 02:56:51.553630 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kkdb8" event={"ID":"af11e3ed-3c58-4ad5-9da7-38b9950ff726","Type":"ContainerDied","Data":"a4ef57440a012753d651a16d44204deffae69a5c7ce4c343117d8a9518188756"} Nov 22 02:56:58 crc kubenswrapper[4952]: I1122 02:56:58.342633 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:56:58 crc kubenswrapper[4952]: I1122 02:56:58.343490 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:56:58 crc kubenswrapper[4952]: I1122 02:56:58.585935 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:56:58 crc kubenswrapper[4952]: I1122 02:56:58.586052 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:56:58 crc kubenswrapper[4952]: I1122 02:56:58.964778 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sw6q8" Nov 22 02:57:04 crc kubenswrapper[4952]: I1122 02:57:04.588322 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:57:04 crc kubenswrapper[4952]: E1122 02:57:04.788447 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 02:57:04 crc kubenswrapper[4952]: E1122 02:57:04.788914 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-97wqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-f95sp_openshift-marketplace(c96d123a-e54d-45b9-aeb1-0083414f67ee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:57:04 crc kubenswrapper[4952]: E1122 02:57:04.790241 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-f95sp" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" Nov 22 02:57:08 crc kubenswrapper[4952]: E1122 02:57:08.098020 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 02:57:08 crc kubenswrapper[4952]: E1122 02:57:08.098890 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w85x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kqxwp_openshift-marketplace(b700aff5-fe4e-45d2-840c-b4a5390b6b27): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:57:08 crc kubenswrapper[4952]: E1122 02:57:08.100242 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kqxwp" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" Nov 22 02:57:08 crc kubenswrapper[4952]: I1122 02:57:08.587796 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Nov 22 02:57:08 crc kubenswrapper[4952]: I1122 02:57:08.588286 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.587890 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kqxwp" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.674203 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.674423 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dphj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bwg2f_openshift-marketplace(d2d87343-7102-459c-a231-294939870dc5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.675637 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bwg2f" podUID="d2d87343-7102-459c-a231-294939870dc5" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.688864 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.689075 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xz7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bcfhk_openshift-marketplace(6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.690261 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bcfhk" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9"
Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.744073 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.744319 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grbq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hlwls_openshift-marketplace(a4f424b2-2973-4b6a-99dd-08fd5b237adf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 02:57:09 crc kubenswrapper[4952]: E1122 02:57:09.745612 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hlwls" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf"
Nov 22 02:57:10 crc kubenswrapper[4952]: E1122 02:57:10.901008 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bwg2f" podUID="d2d87343-7102-459c-a231-294939870dc5"
Nov 22 02:57:10 crc kubenswrapper[4952]: E1122 02:57:10.901044 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bcfhk" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9"
Nov 22 02:57:10 crc kubenswrapper[4952]: E1122 02:57:10.993226 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 22 02:57:10 crc kubenswrapper[4952]: E1122 02:57:10.993484 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7ql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lklxp_openshift-marketplace(fc00999c-0e40-4bca-b54a-2d416d925514): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 02:57:10 crc kubenswrapper[4952]: E1122 02:57:10.995419 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lklxp" podUID="fc00999c-0e40-4bca-b54a-2d416d925514"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.026103 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.026654 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbxz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bsnxq_openshift-marketplace(67059cf9-4ef8-46a6-9012-fe9a32fbf3bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.027534 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.027729 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xnpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-r22df_openshift-marketplace(68fb4fd4-561b-4208-9370-331e71740744): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.027783 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bsnxq" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.028940 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-r22df" podUID="68fb4fd4-561b-4208-9370-331e71740744"
Nov 22 02:57:11 crc kubenswrapper[4952]: W1122 02:57:11.196202 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f8fbba_f533_4cf0_bf11_8b8cacaa35dc.slice/crio-eb834c3068754f72f7ef22f8199ba1be5935b162a036fa4a6793de40e42fe242 WatchSource:0}: Error finding container eb834c3068754f72f7ef22f8199ba1be5935b162a036fa4a6793de40e42fe242: Status 404 returned error can't find the container with id eb834c3068754f72f7ef22f8199ba1be5935b162a036fa4a6793de40e42fe242
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.199344 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gkngm"]
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.696647 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kkdb8" event={"ID":"af11e3ed-3c58-4ad5-9da7-38b9950ff726","Type":"ContainerStarted","Data":"726a581bb1e899807834980b6498fe464782bc05312f69268117cb1e13a11a54"}
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.697033 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kkdb8"
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.697371 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.697657 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.699772 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gkngm" event={"ID":"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc","Type":"ContainerStarted","Data":"533f756fa2d197a71333a437276ff2f7bc5b29fe54971010711879631ee52221"}
Nov 22 02:57:11 crc kubenswrapper[4952]: I1122 02:57:11.699861 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gkngm" event={"ID":"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc","Type":"ContainerStarted","Data":"eb834c3068754f72f7ef22f8199ba1be5935b162a036fa4a6793de40e42fe242"}
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.702417 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-r22df" podUID="68fb4fd4-561b-4208-9370-331e71740744"
Nov 22 02:57:11 crc kubenswrapper[4952]: E1122 02:57:11.702498 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lklxp" podUID="fc00999c-0e40-4bca-b54a-2d416d925514"
Nov 22 02:57:12 crc kubenswrapper[4952]: I1122 02:57:12.706707 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gkngm" event={"ID":"c1f8fbba-f533-4cf0-bf11-8b8cacaa35dc","Type":"ContainerStarted","Data":"1339ee87e93fb22b372777acead1e658418d8514dcddb5583bcbe65f89b7953a"}
Nov 22 02:57:12 crc kubenswrapper[4952]: I1122 02:57:12.707438 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Nov 22 02:57:12 crc kubenswrapper[4952]: I1122 02:57:12.707501 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Nov 22 02:57:12 crc kubenswrapper[4952]: I1122 02:57:12.726504 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gkngm" podStartSLOduration=175.726476851 podStartE2EDuration="2m55.726476851s" podCreationTimestamp="2025-11-22 02:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:57:12.720594609 +0000 UTC m=+197.026611882" watchObservedRunningTime="2025-11-22 02:57:12.726476851 +0000 UTC m=+197.032494184"
Nov 22 02:57:18 crc kubenswrapper[4952]: I1122 02:57:18.586834 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Nov 22 02:57:18 crc kubenswrapper[4952]: I1122 02:57:18.587278 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Nov 22 02:57:18 crc kubenswrapper[4952]: I1122 02:57:18.587092 4952 patch_prober.go:28] interesting pod/downloads-7954f5f757-kkdb8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Nov 22 02:57:18 crc kubenswrapper[4952]: I1122 02:57:18.587381 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kkdb8" podUID="af11e3ed-3c58-4ad5-9da7-38b9950ff726" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Nov 22 02:57:21 crc kubenswrapper[4952]: I1122 02:57:21.771910 4952 generic.go:334] "Generic (PLEG): container finished" podID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerID="bbcec428faeaafeb6d4f4a978bda33515a91b6571155eb84b6b98033727cde85" exitCode=0
Nov 22 02:57:21 crc kubenswrapper[4952]: I1122 02:57:21.772082 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerDied","Data":"bbcec428faeaafeb6d4f4a978bda33515a91b6571155eb84b6b98033727cde85"}
Nov 22 02:57:23 crc kubenswrapper[4952]: I1122 02:57:23.787165 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerStarted","Data":"6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3"}
Nov 22 02:57:23 crc kubenswrapper[4952]: I1122 02:57:23.789636 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerStarted","Data":"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297"}
Nov 22 02:57:23 crc kubenswrapper[4952]: I1122 02:57:23.793069 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerStarted","Data":"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50"}
Nov 22 02:57:23 crc kubenswrapper[4952]: I1122 02:57:23.815839 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f95sp" podStartSLOduration=4.616764278 podStartE2EDuration="59.815807989s" podCreationTimestamp="2025-11-22 02:56:24 +0000 UTC" firstStartedPulling="2025-11-22 02:56:28.182351403 +0000 UTC m=+152.488368676" lastFinishedPulling="2025-11-22 02:57:23.381395104 +0000 UTC m=+207.687412387" observedRunningTime="2025-11-22 02:57:23.810989049 +0000 UTC m=+208.117006322" watchObservedRunningTime="2025-11-22 02:57:23.815807989 +0000 UTC m=+208.121825272"
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.802185 4952 generic.go:334] "Generic (PLEG): container finished" podID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerID="39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50" exitCode=0
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.802275 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerDied","Data":"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50"}
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.805233 4952 generic.go:334] "Generic (PLEG): container finished" podID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerID="249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297" exitCode=0
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.805335 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerDied","Data":"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297"}
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.810381 4952 generic.go:334] "Generic (PLEG): container finished" podID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerID="9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3" exitCode=0
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.810453 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerDied","Data":"9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3"}
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.845664 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:24 crc kubenswrapper[4952]: I1122 02:57:24.845718 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:26 crc kubenswrapper[4952]: I1122 02:57:26.196866 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-f95sp" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server" probeResult="failure" output=<
Nov 22 02:57:26 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s
Nov 22 02:57:26 crc kubenswrapper[4952]: >
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.341876 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.342237 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.342295 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.342943 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.343003 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3" gracePeriod=600
Nov 22 02:57:28 crc kubenswrapper[4952]: E1122 02:57:28.499264 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f311d8_e9ac_4dd7_bc2c_321490681934.slice/crio-a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 02:57:28 crc kubenswrapper[4952]: I1122 02:57:28.606411 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-kkdb8"
Nov 22 02:57:29 crc kubenswrapper[4952]: I1122 02:57:29.842712 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3" exitCode=0
Nov 22 02:57:29 crc kubenswrapper[4952]: I1122 02:57:29.842840 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3"}
Nov 22 02:57:35 crc kubenswrapper[4952]: I1122 02:57:35.180994 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:35 crc kubenswrapper[4952]: I1122 02:57:35.240669 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:35 crc kubenswrapper[4952]: I1122 02:57:35.425929 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f95sp"]
Nov 22 02:57:36 crc kubenswrapper[4952]: I1122 02:57:36.892886 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f95sp" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server" containerID="cri-o://6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3" gracePeriod=2
Nov 22 02:57:38 crc kubenswrapper[4952]: I1122 02:57:38.906117 4952 generic.go:334] "Generic (PLEG): container finished" podID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerID="6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3" exitCode=0
Nov 22 02:57:38 crc kubenswrapper[4952]: I1122 02:57:38.906175 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerDied","Data":"6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3"}
Nov 22 02:57:44 crc kubenswrapper[4952]: E1122 02:57:44.847332 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3 is running failed: container process not found" containerID="6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 02:57:44 crc kubenswrapper[4952]: E1122 02:57:44.849054 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3 is running failed: container process not found" containerID="6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 02:57:44 crc kubenswrapper[4952]: E1122 02:57:44.850388 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3 is running failed: container process not found" containerID="6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 02:57:44 crc kubenswrapper[4952]: E1122 02:57:44.850469 4952 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-f95sp" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server"
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.740004 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.769815 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content\") pod \"c96d123a-e54d-45b9-aeb1-0083414f67ee\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") "
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.769954 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities\") pod \"c96d123a-e54d-45b9-aeb1-0083414f67ee\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") "
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.770030 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97wqk\" (UniqueName: \"kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk\") pod \"c96d123a-e54d-45b9-aeb1-0083414f67ee\" (UID: \"c96d123a-e54d-45b9-aeb1-0083414f67ee\") "
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.770881 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities" (OuterVolumeSpecName: "utilities") pod "c96d123a-e54d-45b9-aeb1-0083414f67ee" (UID: "c96d123a-e54d-45b9-aeb1-0083414f67ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.778290 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk" (OuterVolumeSpecName: "kube-api-access-97wqk") pod "c96d123a-e54d-45b9-aeb1-0083414f67ee" (UID: "c96d123a-e54d-45b9-aeb1-0083414f67ee"). InnerVolumeSpecName "kube-api-access-97wqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.835080 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c96d123a-e54d-45b9-aeb1-0083414f67ee" (UID: "c96d123a-e54d-45b9-aeb1-0083414f67ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.871993 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.872043 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96d123a-e54d-45b9-aeb1-0083414f67ee-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.872061 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97wqk\" (UniqueName: \"kubernetes.io/projected/c96d123a-e54d-45b9-aeb1-0083414f67ee-kube-api-access-97wqk\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.957891 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f95sp" event={"ID":"c96d123a-e54d-45b9-aeb1-0083414f67ee","Type":"ContainerDied","Data":"f9cc49b212c5c448cb52a607488d8760c951d9d6a1116f71192c3d613c1f210f"}
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.957975 4952 scope.go:117] "RemoveContainer" containerID="6d51abb6d833ae5b917fbe422fddc13a5ae76fb625c8f73a33885ed37531d7e3"
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.957977 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f95sp"
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.993159 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f95sp"]
Nov 22 02:57:45 crc kubenswrapper[4952]: I1122 02:57:45.997149 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f95sp"]
Nov 22 02:57:46 crc kubenswrapper[4952]: I1122 02:57:46.544753 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" path="/var/lib/kubelet/pods/c96d123a-e54d-45b9-aeb1-0083414f67ee/volumes"
Nov 22 02:57:46 crc kubenswrapper[4952]: I1122 02:57:46.837118 4952 scope.go:117] "RemoveContainer" containerID="bbcec428faeaafeb6d4f4a978bda33515a91b6571155eb84b6b98033727cde85"
Nov 22 02:57:46 crc kubenswrapper[4952]: I1122 02:57:46.893975 4952 scope.go:117] "RemoveContainer" containerID="ab66888561f1f4662da6fa9f55629d4a9d6d195c38c8abe59565e42e7e61161c"
Nov 22 02:57:47 crc kubenswrapper[4952]: I1122 02:57:47.984611 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerStarted","Data":"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a"}
Nov 22 02:57:47 crc kubenswrapper[4952]: I1122 02:57:47.987599 4952 generic.go:334] "Generic (PLEG): container finished" podID="68fb4fd4-561b-4208-9370-331e71740744" containerID="dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d" exitCode=0
Nov 22 02:57:47 crc kubenswrapper[4952]: I1122 02:57:47.987651 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerDied","Data":"dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d"}
Nov 22 02:57:47 crc kubenswrapper[4952]: I1122 02:57:47.995259 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerStarted","Data":"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:47.999966 4952 generic.go:334] "Generic (PLEG): container finished" podID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerID="02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9" exitCode=0
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.000033 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerDied","Data":"02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.009077 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.010758 4952 generic.go:334] "Generic (PLEG): container finished" podID="fc00999c-0e40-4bca-b54a-2d416d925514" containerID="886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a" exitCode=0
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.010817 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerDied","Data":"886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.019567 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlwls" podStartSLOduration=5.310790777 podStartE2EDuration="1m25.019513362s" podCreationTimestamp="2025-11-22 02:56:23 +0000 UTC" firstStartedPulling="2025-11-22 02:56:27.118778004 +0000 UTC m=+151.424795277" lastFinishedPulling="2025-11-22 02:57:46.827500589 +0000 UTC m=+231.133517862" observedRunningTime="2025-11-22 02:57:48.016320888 +0000 UTC m=+232.322338161" watchObservedRunningTime="2025-11-22 02:57:48.019513362 +0000 UTC m=+232.325530635"
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.027304 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerStarted","Data":"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.032876 4952 generic.go:334] "Generic (PLEG): container finished" podID="d2d87343-7102-459c-a231-294939870dc5" containerID="c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0" exitCode=0
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.032945 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerDied","Data":"c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0"}
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.134070 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bsnxq" podStartSLOduration=5.501614989 podStartE2EDuration="1m24.134040084s" podCreationTimestamp="2025-11-22 02:56:24 +0000 UTC" firstStartedPulling="2025-11-22 02:56:28.182850046 +0000 UTC m=+152.488867309" lastFinishedPulling="2025-11-22 02:57:46.815275131 +0000 UTC m=+231.121292404" observedRunningTime="2025-11-22 02:57:48.13048229 +0000 UTC m=+232.436499583" watchObservedRunningTime="2025-11-22 02:57:48.134040084 +0000 UTC m=+232.440057357"
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.176558 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kqxwp" podStartSLOduration=4.576484457 podStartE2EDuration="1m22.176511988s" podCreationTimestamp="2025-11-22 02:56:26 +0000 UTC" firstStartedPulling="2025-11-22 02:56:29.227625222 +0000 UTC m=+153.533642495" lastFinishedPulling="2025-11-22 02:57:46.827652753 +0000 UTC m=+231.133670026" observedRunningTime="2025-11-22 02:57:48.174891801 +0000 UTC m=+232.480909084" watchObservedRunningTime="2025-11-22 02:57:48.176511988 +0000 UTC m=+232.482529281"
Nov 22 02:57:48 crc kubenswrapper[4952]: I1122 02:57:48.322684 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"]
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.074869 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerStarted","Data":"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c"}
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.090150 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerStarted","Data":"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be"}
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.100361 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerStarted","Data":"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940"}
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.110322 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwg2f" podStartSLOduration=4.68229511 podStartE2EDuration="1m26.110302592s" podCreationTimestamp="2025-11-22 02:56:23 +0000 UTC" firstStartedPulling="2025-11-22 02:56:27.129657683 +0000 UTC m=+151.435674946" lastFinishedPulling="2025-11-22 02:57:48.557665155 +0000 UTC m=+232.863682428" observedRunningTime="2025-11-22 02:57:49.102349519 +0000 UTC m=+233.408366802" watchObservedRunningTime="2025-11-22 02:57:49.110302592 +0000 UTC m=+233.416319875"
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.114606 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerStarted","Data":"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"}
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.138327 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bcfhk" podStartSLOduration=2.882240129 podStartE2EDuration="1m22.138288731s" podCreationTimestamp="2025-11-22 02:56:27 +0000 UTC" firstStartedPulling="2025-11-22 02:56:29.212610377 +0000 UTC m=+153.518627650" lastFinishedPulling="2025-11-22 02:57:48.468658979 +0000 UTC m=+232.774676252" observedRunningTime="2025-11-22 02:57:49.129018019 +0000 UTC m=+233.435035292" watchObservedRunningTime="2025-11-22 02:57:49.138288731 +0000 UTC m=+233.444305994"
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.163567 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lklxp" podStartSLOduration=3.898919553 podStartE2EDuration="1m24.16352546s" podCreationTimestamp="2025-11-22 02:56:25 +0000 UTC" firstStartedPulling="2025-11-22 02:56:28.167861402 +0000 UTC m=+152.473878675" lastFinishedPulling="2025-11-22 02:57:48.432467309 +0000 UTC m=+232.738484582" observedRunningTime="2025-11-22 02:57:49.158870253 +0000 UTC m=+233.464887526" watchObservedRunningTime="2025-11-22 02:57:49.16352546 +0000 UTC m=+233.469542733"
Nov 22 02:57:49 crc kubenswrapper[4952]: I1122 02:57:49.184726 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r22df" podStartSLOduration=2.771664077 podStartE2EDuration="1m23.184700599s" podCreationTimestamp="2025-11-22 02:56:26 +0000 UTC" firstStartedPulling="2025-11-22 02:56:28.167586954 +0000 UTC m=+152.473604227" lastFinishedPulling="2025-11-22 02:57:48.580623476 +0000 UTC m=+232.886640749" observedRunningTime="2025-11-22 02:57:49.182173485 +0000 UTC m=+233.488190758" watchObservedRunningTime="2025-11-22 02:57:49.184700599 +0000 UTC m=+233.490717872"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.222387 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.223919 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.281026 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.617535 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.617633 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:57:54 crc kubenswrapper[4952]: I1122 02:57:54.659620 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.027349 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.027424 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.068532 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.203344 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.204672 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:57:55 crc kubenswrapper[4952]: I1122 02:57:55.217105 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.520004 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lklxp"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.520057 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lklxp"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.565102 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lklxp"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.708357 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.708415 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.751809 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:57:56 crc kubenswrapper[4952]: I1122 02:57:56.799359 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"]
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.170870 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bsnxq" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="registry-server" containerID="cri-o://3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51" gracePeriod=2
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.226267 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lklxp"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.248247 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.338326 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.338747 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.399444 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.563422 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.696036 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content\") pod \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") "
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.696133 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxz4\" (UniqueName: \"kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4\") pod \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") "
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.696386 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities\") pod \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\" (UID: \"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc\") "
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.697270 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities" (OuterVolumeSpecName: "utilities") pod "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" (UID: "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.708982 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4" (OuterVolumeSpecName: "kube-api-access-rbxz4") pod "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" (UID: "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc"). InnerVolumeSpecName "kube-api-access-rbxz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.768846 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" (UID: "67059cf9-4ef8-46a6-9012-fe9a32fbf3bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.771374 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bcfhk"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.771712 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bcfhk"
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.798590 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.798632 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.798646 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxz4\" (UniqueName: \"kubernetes.io/projected/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc-kube-api-access-rbxz4\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:57 crc kubenswrapper[4952]: I1122 02:57:57.816706 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bcfhk"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.179131 4952 generic.go:334] "Generic (PLEG): container finished" podID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerID="3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51" exitCode=0
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.179223 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnxq"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.179204 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerDied","Data":"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"}
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.179416 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnxq" event={"ID":"67059cf9-4ef8-46a6-9012-fe9a32fbf3bc","Type":"ContainerDied","Data":"8c5e33a759d0b7a564075144bf9469850bc6812d1d6c32b8df6bc503c600bc2b"}
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.179455 4952 scope.go:117] "RemoveContainer" containerID="3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.198991 4952 scope.go:117] "RemoveContainer" containerID="9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.211786 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"]
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.228433 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bsnxq"]
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.246828 4952 scope.go:117] "RemoveContainer" containerID="4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.250839 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.254032 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bcfhk"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.324528 4952 scope.go:117] "RemoveContainer" containerID="3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"
Nov 22 02:57:58 crc kubenswrapper[4952]: E1122 02:57:58.325566 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51\": container with ID starting with 3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51 not found: ID does not exist" containerID="3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.325596 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51"} err="failed to get container status \"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51\": rpc error: code = NotFound desc = could not find container \"3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51\": container with ID starting with 3c508a60b0cd4e8e78db6b740e20a369b0ab29f535772a5280f29ff93b683e51 not found: ID does not exist"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.325616 4952 scope.go:117] "RemoveContainer" containerID="9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3"
Nov 22 02:57:58 crc kubenswrapper[4952]: E1122 02:57:58.327709 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3\": container with ID starting with 9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3 not found: ID does not exist" containerID="9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.327756 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3"} err="failed to get container status \"9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3\": rpc error: code = NotFound desc = could not find container \"9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3\": container with ID starting with 9edac573e59ec27aa8c518fc6c44e8b27d0649930fdcdff9ec8131b4e4c1e3b3 not found: ID does not exist"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.327786 4952 scope.go:117] "RemoveContainer" containerID="4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7"
Nov 22 02:57:58 crc kubenswrapper[4952]: E1122 02:57:58.328238 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7\": container with ID starting with 4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7 not found: ID does not exist" containerID="4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.328265 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7"} err="failed to get container status \"4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7\": rpc error: code = NotFound desc = could not find container \"4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7\": container with ID starting with 4a2122ab86cff1fc3720f2fec7e2486585d91f259f9a1762855355bd114031a7 not found: ID does not exist"
Nov 22 02:57:58 crc kubenswrapper[4952]: I1122 02:57:58.541911 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" path="/var/lib/kubelet/pods/67059cf9-4ef8-46a6-9012-fe9a32fbf3bc/volumes"
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.188363 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"]
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.192231 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r22df" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="registry-server" containerID="cri-o://333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524" gracePeriod=2
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.661720 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.861155 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xnpv\" (UniqueName: \"kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv\") pod \"68fb4fd4-561b-4208-9370-331e71740744\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") "
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.861222 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content\") pod \"68fb4fd4-561b-4208-9370-331e71740744\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") "
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.861401 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities\") pod \"68fb4fd4-561b-4208-9370-331e71740744\" (UID: \"68fb4fd4-561b-4208-9370-331e71740744\") "
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.863359 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities" (OuterVolumeSpecName: "utilities") pod "68fb4fd4-561b-4208-9370-331e71740744" (UID: "68fb4fd4-561b-4208-9370-331e71740744"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.870983 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv" (OuterVolumeSpecName: "kube-api-access-6xnpv") pod "68fb4fd4-561b-4208-9370-331e71740744" (UID: "68fb4fd4-561b-4208-9370-331e71740744"). InnerVolumeSpecName "kube-api-access-6xnpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.886329 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68fb4fd4-561b-4208-9370-331e71740744" (UID: "68fb4fd4-561b-4208-9370-331e71740744"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.962916 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.962992 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xnpv\" (UniqueName: \"kubernetes.io/projected/68fb4fd4-561b-4208-9370-331e71740744-kube-api-access-6xnpv\") on node \"crc\" DevicePath \"\""
Nov 22 02:57:59 crc kubenswrapper[4952]: I1122 02:57:59.963008 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68fb4fd4-561b-4208-9370-331e71740744-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.208705 4952 generic.go:334] "Generic (PLEG): container finished" podID="68fb4fd4-561b-4208-9370-331e71740744" containerID="333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524" exitCode=0
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.208802 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerDied","Data":"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"}
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.208927 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r22df" event={"ID":"68fb4fd4-561b-4208-9370-331e71740744","Type":"ContainerDied","Data":"d0d8767f6128e122c8505e092346c1c791f7489c922b6855452cbf5c909c3a3a"}
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.208846 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r22df"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.208972 4952 scope.go:117] "RemoveContainer" containerID="333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.245705 4952 scope.go:117] "RemoveContainer" containerID="dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.274268 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"]
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.280010 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r22df"]
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.282594 4952 scope.go:117] "RemoveContainer" containerID="8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.315835 4952 scope.go:117] "RemoveContainer" containerID="333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"
Nov 22 02:58:00 crc kubenswrapper[4952]: E1122 02:58:00.316704 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524\": container with ID starting with 333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524 not found: ID does not exist" containerID="333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.316829 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524"} err="failed to get container status \"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524\": rpc error: code = NotFound desc = could not find container \"333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524\": container with ID starting with 333fed9760cc964b8176adc21bf3b81b0da93b7984809f570f9a10cef02af524 not found: ID does not exist"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.316894 4952 scope.go:117] "RemoveContainer" containerID="dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d"
Nov 22 02:58:00 crc kubenswrapper[4952]: E1122 02:58:00.317499 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d\": container with ID starting with dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d not found: ID does not exist" containerID="dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.317569 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d"} err="failed to get container status \"dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d\": rpc error: code = NotFound desc = could not find container \"dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d\": container with ID starting with dafaf3655756e61833978b7facc5f2dc8fa1d3b0bd3f814ed0cda757840e272d not found: ID does not exist"
Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.317599 4952 scope.go:117] "RemoveContainer"
containerID="8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e" Nov 22 02:58:00 crc kubenswrapper[4952]: E1122 02:58:00.318069 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e\": container with ID starting with 8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e not found: ID does not exist" containerID="8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e" Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.318138 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e"} err="failed to get container status \"8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e\": rpc error: code = NotFound desc = could not find container \"8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e\": container with ID starting with 8fa4539cbbf3e920deb07d842e4216359f522c65e99326f3280f0754c462a75e not found: ID does not exist" Nov 22 02:58:00 crc kubenswrapper[4952]: I1122 02:58:00.546524 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fb4fd4-561b-4208-9370-331e71740744" path="/var/lib/kubelet/pods/68fb4fd4-561b-4208-9370-331e71740744/volumes" Nov 22 02:58:01 crc kubenswrapper[4952]: I1122 02:58:01.587518 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:58:01 crc kubenswrapper[4952]: I1122 02:58:01.588057 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bcfhk" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="registry-server" containerID="cri-o://fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be" gracePeriod=2 Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.030059 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.200847 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content\") pod \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.201045 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities\") pod \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.201153 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xz7h\" (UniqueName: \"kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h\") pod \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\" (UID: \"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9\") " Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.202704 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities" (OuterVolumeSpecName: "utilities") pod "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" (UID: "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.206715 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h" (OuterVolumeSpecName: "kube-api-access-6xz7h") pod "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" (UID: "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9"). InnerVolumeSpecName "kube-api-access-6xz7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.232281 4952 generic.go:334] "Generic (PLEG): container finished" podID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerID="fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be" exitCode=0 Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.232380 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerDied","Data":"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be"} Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.232430 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcfhk" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.232487 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcfhk" event={"ID":"6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9","Type":"ContainerDied","Data":"311a328228755158122e49518b2a5fc531a1482855b663b9f489a36977f0b341"} Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.232629 4952 scope.go:117] "RemoveContainer" containerID="fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.263032 4952 scope.go:117] "RemoveContainer" containerID="02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.290238 4952 scope.go:117] "RemoveContainer" containerID="0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.303245 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.303295 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xz7h\" (UniqueName: \"kubernetes.io/projected/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-kube-api-access-6xz7h\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.319523 4952 scope.go:117] "RemoveContainer" containerID="fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be" Nov 22 02:58:02 crc kubenswrapper[4952]: E1122 02:58:02.320444 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be\": container with ID starting with fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be not found: ID does not exist" containerID="fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.320529 4952 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be"} err="failed to get container status \"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be\": rpc error: code = NotFound desc = could not find container \"fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be\": container with ID starting with fb60e3f5b14b5c296a3eb54dfa830dc7b649a322e89069bf5bb0dc63456403be not found: ID does not exist" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.320612 4952 scope.go:117] "RemoveContainer" containerID="02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9" Nov 22 02:58:02 crc kubenswrapper[4952]: E1122 02:58:02.321198 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9\": container with ID starting with 02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9 not found: ID does not exist" containerID="02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.321281 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9"} err="failed to get container status \"02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9\": rpc error: code = NotFound desc = could not find container \"02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9\": container with ID starting with 02dd6b875a785c0b53e5d7907b179c84cfc80be9b823542fa1cac872b35cd0e9 not found: ID does not exist" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.321345 4952 scope.go:117] "RemoveContainer" containerID="0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05" Nov 22 02:58:02 crc kubenswrapper[4952]: E1122 02:58:02.321949 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05\": container with ID starting with 0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05 not found: ID does not exist" containerID="0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.321993 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05"} err="failed to get container status \"0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05\": rpc error: code = NotFound desc = could not find container \"0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05\": container with ID starting with 0494ea013f98756319bfb590c89a6996927de3c4a7bf5e040bc05f3b645e3f05 not found: ID does not exist" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.328493 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" (UID: "6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.404867 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.589019 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:58:02 crc kubenswrapper[4952]: I1122 02:58:02.602476 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bcfhk"] Nov 22 02:58:04 crc kubenswrapper[4952]: I1122 02:58:04.544318 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" path="/var/lib/kubelet/pods/6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9/volumes" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.361939 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerName="oauth-openshift" containerID="cri-o://a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a" gracePeriod=15 Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.843028 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.878895 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"] Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879155 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879169 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879181 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879189 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879199 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879207 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879225 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879233 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879244 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879251 4952 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879263 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879272 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879284 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879291 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879303 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879311 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879324 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879332 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879342 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879350 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879364 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerName="oauth-openshift" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879374 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerName="oauth-openshift" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879385 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879392 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="extract-content" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879402 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879410 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="extract-utilities" Nov 22 02:58:13 crc kubenswrapper[4952]: E1122 02:58:13.879419 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac254f5-32f7-4cfd-a411-422a7398dc15" containerName="pruner" Nov 22 02:58:13 crc kubenswrapper[4952]: 
I1122 02:58:13.879427 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac254f5-32f7-4cfd-a411-422a7398dc15" containerName="pruner" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879558 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerName="oauth-openshift" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879574 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac254f5-32f7-4cfd-a411-422a7398dc15" containerName="pruner" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879587 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9c9780-c5e0-489d-b67d-7d4bc2dfb6f9" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879597 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fb4fd4-561b-4208-9370-331e71740744" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879608 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96d123a-e54d-45b9-aeb1-0083414f67ee" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.879617 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="67059cf9-4ef8-46a6-9012-fe9a32fbf3bc" containerName="registry-server" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.880076 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:13 crc kubenswrapper[4952]: I1122 02:58:13.901816 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"] Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.004869 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.004957 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005021 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005065 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005097 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" 
(UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005257 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005776 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005925 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.005966 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006000 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006041 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006084 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006313 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006800 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsj25\" (UniqueName: \"kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.006960 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007377 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007558 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007607 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca\") pod \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\" (UID: \"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e\") " Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007854 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007907 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007939 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " 
pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.007963 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008021 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b627f814-fadb-4366-9944-efb089259ebf-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008066 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008157 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008277 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008371 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008434 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008472 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008532 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgck\" (UniqueName: \"kubernetes.io/projected/b627f814-fadb-4366-9944-efb089259ebf-kube-api-access-hlgck\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008597 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008660 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008725 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008848 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008876 4952 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008901 4952 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008916 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.008930 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.013427 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.014539 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.014804 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25" (OuterVolumeSpecName: "kube-api-access-vsj25") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "kube-api-access-vsj25". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.026693 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.027438 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.027790 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.028066 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.028279 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.028592 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" (UID: "05f9129d-efb4-4fc2-b329-ec9adb9a9f3e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.110523 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.110900 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111043 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111143 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111248 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111360 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b627f814-fadb-4366-9944-efb089259ebf-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111622 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111735 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " 
pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111906 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.112033 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.112136 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.111893 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b627f814-fadb-4366-9944-efb089259ebf-audit-dir\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.112248 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgck\" (UniqueName: \"kubernetes.io/projected/b627f814-fadb-4366-9944-efb089259ebf-kube-api-access-hlgck\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.112607 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113020 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113126 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 
02:58:14.113192 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-service-ca\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113436 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113468 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113488 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113502 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113514 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113528 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsj25\" (UniqueName: \"kubernetes.io/projected/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-kube-api-access-vsj25\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113556 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113572 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113582 4952 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.113581 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" Nov 22 02:58:14 crc 
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.114919 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b627f814-fadb-4366-9944-efb089259ebf-audit-policies\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.117712 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.118164 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.118162 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-error\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.118277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.118570 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-session\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.119724 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.119898 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-system-router-certs\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.120492 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b627f814-fadb-4366-9944-efb089259ebf-v4-0-config-user-template-login\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.141853 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgck\" (UniqueName: \"kubernetes.io/projected/b627f814-fadb-4366-9944-efb089259ebf-kube-api-access-hlgck\") pod \"oauth-openshift-bc9f7ddc4-mr9bc\" (UID: \"b627f814-fadb-4366-9944-efb089259ebf\") " pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.213293 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.323975 4952 generic.go:334] "Generic (PLEG): container finished" podID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" containerID="a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a" exitCode=0
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.324036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" event={"ID":"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e","Type":"ContainerDied","Data":"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"}
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.324074 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq" event={"ID":"05f9129d-efb4-4fc2-b329-ec9adb9a9f3e","Type":"ContainerDied","Data":"8d1ebf7c3495dec7892d17673cb7f5a7ecbe4910a1202567449676018cd9f73a"}
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.324097 4952 scope.go:117] "RemoveContainer" containerID="a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.324242 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vpkgq"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.376756 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"]
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.377675 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vpkgq"]
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.381957 4952 scope.go:117] "RemoveContainer" containerID="a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"
Nov 22 02:58:14 crc kubenswrapper[4952]: E1122 02:58:14.383744 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a\": container with ID starting with a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a not found: ID does not exist" containerID="a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.383813 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a"} err="failed to get container status \"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a\": rpc error: code = NotFound desc = could not find container \"a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a\": container with ID starting with a628aa836cb3c11cc16f9c3073c2bc775c7cbd7705975312b4668113c9f4f03a not found: ID does not exist"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.542803 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f9129d-efb4-4fc2-b329-ec9adb9a9f3e" path="/var/lib/kubelet/pods/05f9129d-efb4-4fc2-b329-ec9adb9a9f3e/volumes"
Nov 22 02:58:14 crc kubenswrapper[4952]: I1122 02:58:14.544320 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"]
Nov 22 02:58:15 crc kubenswrapper[4952]: I1122 02:58:15.337380 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" event={"ID":"b627f814-fadb-4366-9944-efb089259ebf","Type":"ContainerStarted","Data":"ca5c3c54b3fbbea9041d3e7f2beedc4ea7275fbef43beabe934a967a6e919e94"}
Nov 22 02:58:15 crc kubenswrapper[4952]: I1122 02:58:15.337457 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" event={"ID":"b627f814-fadb-4366-9944-efb089259ebf","Type":"ContainerStarted","Data":"524464c75ab5b205f5465fcd90a7c84899099767042362e209db8c5a2836d96f"}
Nov 22 02:58:15 crc kubenswrapper[4952]: I1122 02:58:15.338218 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
Nov 22 02:58:15 crc kubenswrapper[4952]: I1122 02:58:15.348677 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc"
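Note the pattern in the ContainerStatus / DeleteContainer pair above: the error is logged at Error level by the RPC layer but swallowed at Info level by the deletor, because a NotFound on delete means the desired end state (container gone) is already reached. A small sketch of that idempotent-delete pattern, under the assumption of a toy in-memory runtime rather than CRI-O:

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the runtime's NotFound RPC error above.
var errNotFound = errors.New("container not found: ID does not exist")

// removeContainer tolerates NotFound: if the container is already gone,
// the deletion goal is met, so the error is reported and then dropped.
func removeContainer(runtime map[string]bool, id string) error {
	if !runtime[id] {
		fmt.Printf("DeleteContainer returned error for %s: %v (ignored)\n", id[:8], errNotFound)
		return nil
	}
	delete(runtime, id)
	fmt.Printf("removed %s\n", id[:8])
	return nil
}

func main() {
	runtime := map[string]bool{"a628aa836cb3": true}
	_ = removeContainer(runtime, "a628aa836cb3") // first call removes it
	_ = removeContainer(runtime, "a628aa836cb3") // second call races: NotFound, tolerated
}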
Nov 22 02:58:15 crc kubenswrapper[4952]: I1122 02:58:15.379009 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bc9f7ddc4-mr9bc" podStartSLOduration=27.378971109 podStartE2EDuration="27.378971109s" podCreationTimestamp="2025-11-22 02:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:58:15.370352756 +0000 UTC m=+259.676370080" watchObservedRunningTime="2025-11-22 02:58:15.378971109 +0000 UTC m=+259.684988462"
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.829079 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.830147 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwg2f" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="registry-server" containerID="cri-o://efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c" gracePeriod=30
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.842401 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlwls"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.842868 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlwls" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="registry-server" containerID="cri-o://0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a" gracePeriod=30
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.861386 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.861790 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" containerID="cri-o://781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6" gracePeriod=30
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.868218 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.868656 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lklxp" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="registry-server" containerID="cri-o://569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940" gracePeriod=30
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.870253 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.870785 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kqxwp" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="registry-server" containerID="cri-o://5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" gracePeriod=30
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.872375 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dg5wr"]
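Each "Killing container with a grace period" record with gracePeriod=30 means the runtime first asks the container to stop and only escalates to a forced kill if it has not exited within the window. A minimal sketch of that two-phase termination, using channels to simulate the container process (not the real kuberuntime code path):

package main

import (
	"fmt"
	"time"
)

// killWithGrace asks the container to stop, waits up to gracePeriod for
// it to exit, then force-kills — the behavior behind gracePeriod=30.
func killWithGrace(stop func(), exited <-chan struct{}, gracePeriod time.Duration) {
	stop() // deliver the polite stop signal (SIGTERM in a real runtime)
	select {
	case <-exited:
		fmt.Println("container exited within grace period")
	case <-time.After(gracePeriod):
		fmt.Println("grace period elapsed; force kill (SIGKILL)")
	}
}

func main() {
	exited := make(chan struct{})
	go func() { time.Sleep(10 * time.Millisecond); close(exited) }()
	killWithGrace(func() { fmt.Println("stop signal sent") }, exited, 100*time.Millisecond)
}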
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.873581 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.900989 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dg5wr"]
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.968980 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.969035 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d29qx\" (UniqueName: \"kubernetes.io/projected/421c7496-b72b-4558-8064-39b4578d0cda-kube-api-access-d29qx\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:26 crc kubenswrapper[4952]: I1122 02:58:26.969060 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.070480 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.070902 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d29qx\" (UniqueName: \"kubernetes.io/projected/421c7496-b72b-4558-8064-39b4578d0cda-kube-api-access-d29qx\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.070927 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.072763 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr"
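The same three volumes appear three times above because each one walks a staged pipeline: verify the volume is attached, start the mount operation, then report SetUp success. A toy, heavily simplified rendering of that staging (hypothetical function; the real operation executor runs these stages asynchronously):

package main

import "fmt"

// setUpVolume mirrors the three per-volume records above: verify attach,
// start mount, set up. In the kubelet these are separate async operations;
// here they are collapsed into one sequential sketch.
func setUpVolume(name string) {
	fmt.Printf("operationExecutor.VerifyControllerAttachedVolume started for %q\n", name)
	fmt.Printf("operationExecutor.MountVolume started for %q\n", name)
	// ... materialize the volume source under the pod's volumes dir ...
	fmt.Printf("MountVolume.SetUp succeeded for %q\n", name)
}

func main() {
	for _, v := range []string{"marketplace-trusted-ca", "kube-api-access-d29qx", "marketplace-operator-metrics"} {
		setUpVolume(v)
	}
}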
\"kubernetes.io/secret/421c7496-b72b-4558-8064-39b4578d0cda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.091221 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d29qx\" (UniqueName: \"kubernetes.io/projected/421c7496-b72b-4558-8064-39b4578d0cda-kube-api-access-d29qx\") pod \"marketplace-operator-79b997595-dg5wr\" (UID: \"421c7496-b72b-4558-8064-39b4578d0cda\") " pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.240118 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.317280 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlwls" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.318864 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwg2f" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.340526 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 is running failed: container process not found" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.340962 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 is running failed: container process not found" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.341372 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 is running failed: container process not found" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.341397 4952 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-kqxwp" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="registry-server" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.361083 4952 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.361083 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.379944 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbq5\" (UniqueName: \"kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5\") pod \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.380009 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics\") pod \"85b3681c-313d-40d1-b1f9-c8410c81dc20\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.380059 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities\") pod \"d2d87343-7102-459c-a231-294939870dc5\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.380104 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content\") pod \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.380268 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities\") pod \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\" (UID: \"a4f424b2-2973-4b6a-99dd-08fd5b237adf\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.380311 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dphj6\" (UniqueName: \"kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6\") pod \"d2d87343-7102-459c-a231-294939870dc5\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.381225 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities" (OuterVolumeSpecName: "utilities") pod "d2d87343-7102-459c-a231-294939870dc5" (UID: "d2d87343-7102-459c-a231-294939870dc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.383429 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities" (OuterVolumeSpecName: "utilities") pod "a4f424b2-2973-4b6a-99dd-08fd5b237adf" (UID: "a4f424b2-2973-4b6a-99dd-08fd5b237adf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.387449 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5" (OuterVolumeSpecName: "kube-api-access-grbq5") pod "a4f424b2-2973-4b6a-99dd-08fd5b237adf" (UID: "a4f424b2-2973-4b6a-99dd-08fd5b237adf"). InnerVolumeSpecName "kube-api-access-grbq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.387864 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6" (OuterVolumeSpecName: "kube-api-access-dphj6") pod "d2d87343-7102-459c-a231-294939870dc5" (UID: "d2d87343-7102-459c-a231-294939870dc5"). InnerVolumeSpecName "kube-api-access-dphj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.391301 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "85b3681c-313d-40d1-b1f9-c8410c81dc20" (UID: "85b3681c-313d-40d1-b1f9-c8410c81dc20"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.407365 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.434380 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lklxp"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.442006 4952 generic.go:334] "Generic (PLEG): container finished" podID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerID="0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a" exitCode=0
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.442249 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerDied","Data":"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.442255 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlwls"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.442409 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwls" event={"ID":"a4f424b2-2973-4b6a-99dd-08fd5b237adf","Type":"ContainerDied","Data":"448f979b5a1a009f4388d263945ec4f19c03e5ba3055c9dc14242630156a4ee8"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.443157 4952 scope.go:117] "RemoveContainer" containerID="0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.465418 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqxwp"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.465454 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerDied","Data":"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.464811 4952 generic.go:334] "Generic (PLEG): container finished" podID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" exitCode=0
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.465921 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqxwp" event={"ID":"b700aff5-fe4e-45d2-840c-b4a5390b6b27","Type":"ContainerDied","Data":"b5a42f9ea9fe4d8b638cf5b4706917d141617424acbbf84be2c6ca8b00a2ca8d"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.471425 4952 generic.go:334] "Generic (PLEG): container finished" podID="d2d87343-7102-459c-a231-294939870dc5" containerID="efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c" exitCode=0
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.471675 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerDied","Data":"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.471764 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwg2f" event={"ID":"d2d87343-7102-459c-a231-294939870dc5","Type":"ContainerDied","Data":"7176f1aee2d9db649f4bc38e52926e570ce0edf6ae8a2d42a0e8871f84725d57"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.471908 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwg2f"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.472780 4952 scope.go:117] "RemoveContainer" containerID="39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.475316 4952 generic.go:334] "Generic (PLEG): container finished" podID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerID="781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6" exitCode=0
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.475497 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" event={"ID":"85b3681c-313d-40d1-b1f9-c8410c81dc20","Type":"ContainerDied","Data":"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.476150 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km" event={"ID":"85b3681c-313d-40d1-b1f9-c8410c81dc20","Type":"ContainerDied","Data":"5de63b06bcb96d31a642cca322877f5dc11ac3eeff4102c405c4a22def21174b"}
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.476504 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9km"
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481157 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca\") pod \"85b3681c-313d-40d1-b1f9-c8410c81dc20\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481188 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content\") pod \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481203 4952 generic.go:334] "Generic (PLEG): container finished" podID="fc00999c-0e40-4bca-b54a-2d416d925514" containerID="569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940" exitCode=0
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481218 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities\") pod \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481391 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w85x\" (UniqueName: \"kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x\") pod \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\" (UID: \"b700aff5-fe4e-45d2-840c-b4a5390b6b27\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481468 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content\") pod \"fc00999c-0e40-4bca-b54a-2d416d925514\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481501 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities\") pod \"fc00999c-0e40-4bca-b54a-2d416d925514\" (UID: \"fc00999c-0e40-4bca-b54a-2d416d925514\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481557 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8f5\" (UniqueName: \"kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5\") pod \"85b3681c-313d-40d1-b1f9-c8410c81dc20\" (UID: \"85b3681c-313d-40d1-b1f9-c8410c81dc20\") "
Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481622 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content\") pod \"d2d87343-7102-459c-a231-294939870dc5\" (UID: \"d2d87343-7102-459c-a231-294939870dc5\") "
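A note on the two recurring sandbox messages: util.go:30 ("No sandbox for pod can be found") fires when no sandbox record exists at all, typically for a freshly added pod, while util.go:48 ("No ready sandbox for pod can be found") fires when a sandbox exists but is no longer ready, typically during teardown or restart. Either way the kubelet must create a fresh sandbox before it can (re)start containers. A hypothetical sketch of that decision, with a plain state map standing in for the real kuberuntime types:

package main

import "fmt"

// ensureSandbox distinguishes "missing" from "present but not ready";
// both branches end with a new sandbox being started.
func ensureSandbox(sandboxes map[string]string, pod string) {
	state, ok := sandboxes[pod]
	switch {
	case !ok:
		fmt.Printf("No sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
	case state != "ready":
		fmt.Printf("No ready sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
	default:
		return // a ready sandbox exists; reuse it
	}
	sandboxes[pod] = "ready" // start the new sandbox
}

func main() {
	sandboxes := map[string]string{"openshift-marketplace/marketplace-operator-79b997595-9g9km": "notready"}
	ensureSandbox(sandboxes, "openshift-marketplace/marketplace-operator-79b997595-dg5wr") // missing
	ensureSandbox(sandboxes, "openshift-marketplace/marketplace-operator-79b997595-9g9km") // not ready
}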
\"fc00999c-0e40-4bca-b54a-2d416d925514\") " Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481970 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbq5\" (UniqueName: \"kubernetes.io/projected/a4f424b2-2973-4b6a-99dd-08fd5b237adf-kube-api-access-grbq5\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481992 4952 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.482003 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.482014 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.482025 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dphj6\" (UniqueName: \"kubernetes.io/projected/d2d87343-7102-459c-a231-294939870dc5-kube-api-access-dphj6\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.481972 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities" (OuterVolumeSpecName: "utilities") pod "b700aff5-fe4e-45d2-840c-b4a5390b6b27" (UID: "b700aff5-fe4e-45d2-840c-b4a5390b6b27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.482666 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "85b3681c-313d-40d1-b1f9-c8410c81dc20" (UID: "85b3681c-313d-40d1-b1f9-c8410c81dc20"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.482877 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities" (OuterVolumeSpecName: "utilities") pod "fc00999c-0e40-4bca-b54a-2d416d925514" (UID: "fc00999c-0e40-4bca-b54a-2d416d925514"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.483060 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerDied","Data":"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940"} Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.483113 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lklxp" event={"ID":"fc00999c-0e40-4bca-b54a-2d416d925514","Type":"ContainerDied","Data":"81db02489d5dace86350433651e39c4fce6c5080b1e2f4f5206cf2d23c880720"} Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.483268 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lklxp" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.489760 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc00999c-0e40-4bca-b54a-2d416d925514-kube-api-access-c7ql6" (OuterVolumeSpecName: "kube-api-access-c7ql6") pod "fc00999c-0e40-4bca-b54a-2d416d925514" (UID: "fc00999c-0e40-4bca-b54a-2d416d925514"). InnerVolumeSpecName "kube-api-access-c7ql6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.502528 4952 scope.go:117] "RemoveContainer" containerID="073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.502949 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5" (OuterVolumeSpecName: "kube-api-access-qj8f5") pod "85b3681c-313d-40d1-b1f9-c8410c81dc20" (UID: "85b3681c-313d-40d1-b1f9-c8410c81dc20"). InnerVolumeSpecName "kube-api-access-qj8f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.512635 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc00999c-0e40-4bca-b54a-2d416d925514" (UID: "fc00999c-0e40-4bca-b54a-2d416d925514"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.521656 4952 scope.go:117] "RemoveContainer" containerID="0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.522027 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a\": container with ID starting with 0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a not found: ID does not exist" containerID="0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.522082 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a"} err="failed to get container status \"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a\": rpc error: code = NotFound desc = could not find container \"0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a\": container with ID starting with 0aac6f8fe0da0bcf283ce7177feed7c866782aa2b28b84c169d33ac01c6a5f0a not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.522111 4952 scope.go:117] "RemoveContainer" containerID="39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.522392 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x" (OuterVolumeSpecName: "kube-api-access-9w85x") pod "b700aff5-fe4e-45d2-840c-b4a5390b6b27" (UID: "b700aff5-fe4e-45d2-840c-b4a5390b6b27"). InnerVolumeSpecName "kube-api-access-9w85x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.522560 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50\": container with ID starting with 39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50 not found: ID does not exist" containerID="39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.522610 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50"} err="failed to get container status \"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50\": rpc error: code = NotFound desc = could not find container \"39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50\": container with ID starting with 39939ea9e11dea1d07d0fc9afcdb09cff2ea20841f0a0afff65fbeb6ba45eb50 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.522667 4952 scope.go:117] "RemoveContainer" containerID="073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.523023 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0\": container with ID starting with 073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0 not found: ID does not exist" containerID="073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.523063 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0"} err="failed to get container status \"073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0\": rpc error: code = NotFound desc = could not find container \"073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0\": container with ID starting with 073777784f2e0cf98c932ae15eafe629efa39c809310dda5d418c9d0bc9ba3c0 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.523080 4952 scope.go:117] "RemoveContainer" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.533072 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4f424b2-2973-4b6a-99dd-08fd5b237adf" (UID: "a4f424b2-2973-4b6a-99dd-08fd5b237adf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.549129 4952 scope.go:117] "RemoveContainer" containerID="249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.561926 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2d87343-7102-459c-a231-294939870dc5" (UID: "d2d87343-7102-459c-a231-294939870dc5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.569069 4952 scope.go:117] "RemoveContainer" containerID="1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.583126 4952 scope.go:117] "RemoveContainer" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.583646 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4\": container with ID starting with 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 not found: ID does not exist" containerID="5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.583688 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4"} err="failed to get container status \"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4\": rpc error: code = NotFound desc = could not find container \"5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4\": container with ID starting with 5ef5458a18461227b1eb62851cd33917f181f3d15ea43722024373acf4c715a4 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.583727 4952 scope.go:117] "RemoveContainer" containerID="249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.584038 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297\": container with ID starting with 249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297 not found: ID does not exist" containerID="249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584060 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297"} err="failed to get container status \"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297\": rpc error: code = NotFound desc = could not find container \"249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297\": container with ID starting with 249b13e6661ca1214f296b17241161a817f535745239117452f46db059e4f297 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584076 4952 scope.go:117] "RemoveContainer" containerID="1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.584325 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba\": container with ID starting with 1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba not found: ID does not exist" containerID="1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584354 4952 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba"} err="failed to get container status \"1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba\": rpc error: code = NotFound desc = could not find container \"1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba\": container with ID starting with 1244203c72d010abe4876ff667332ceb95e53526dda7382c1a815fa87517e2ba not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584375 4952 scope.go:117] "RemoveContainer" containerID="efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584838 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8f5\" (UniqueName: \"kubernetes.io/projected/85b3681c-313d-40d1-b1f9-c8410c81dc20-kube-api-access-qj8f5\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.584934 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d87343-7102-459c-a231-294939870dc5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585014 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7ql6\" (UniqueName: \"kubernetes.io/projected/fc00999c-0e40-4bca-b54a-2d416d925514-kube-api-access-c7ql6\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585088 4952 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85b3681c-313d-40d1-b1f9-c8410c81dc20-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585164 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585242 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f424b2-2973-4b6a-99dd-08fd5b237adf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585325 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w85x\" (UniqueName: \"kubernetes.io/projected/b700aff5-fe4e-45d2-840c-b4a5390b6b27-kube-api-access-9w85x\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585408 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.585487 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00999c-0e40-4bca-b54a-2d416d925514-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.599473 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b700aff5-fe4e-45d2-840c-b4a5390b6b27" (UID: "b700aff5-fe4e-45d2-840c-b4a5390b6b27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.604586 4952 scope.go:117] "RemoveContainer" containerID="c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.619700 4952 scope.go:117] "RemoveContainer" containerID="a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.633644 4952 scope.go:117] "RemoveContainer" containerID="efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.634396 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c\": container with ID starting with efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c not found: ID does not exist" containerID="efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.634459 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c"} err="failed to get container status \"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c\": rpc error: code = NotFound desc = could not find container \"efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c\": container with ID starting with efc6a5b830d3a78b7d51922db034198c8b56d50e7d8f4826990cc5130ede733c not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.634510 4952 scope.go:117] "RemoveContainer" containerID="c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.635142 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0\": container with ID starting with c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0 not found: ID does not exist" containerID="c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.635181 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0"} err="failed to get container status \"c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0\": rpc error: code = NotFound desc = could not find container \"c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0\": container with ID starting with c8cd752652ee4e49e068a0017f0a20455aae036612eb88f155d96a527acf7ce0 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.635207 4952 scope.go:117] "RemoveContainer" containerID="a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.635563 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597\": container with ID starting with a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597 not found: ID does not exist" containerID="a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597" Nov 22 02:58:27 crc 
kubenswrapper[4952]: I1122 02:58:27.635603 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597"} err="failed to get container status \"a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597\": rpc error: code = NotFound desc = could not find container \"a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597\": container with ID starting with a8fba462adadb147b58fa82c0ebdd0733853eccc62d65345a835cf957de78597 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.635637 4952 scope.go:117] "RemoveContainer" containerID="781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.650484 4952 scope.go:117] "RemoveContainer" containerID="781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.651152 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6\": container with ID starting with 781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6 not found: ID does not exist" containerID="781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.651183 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6"} err="failed to get container status \"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6\": rpc error: code = NotFound desc = could not find container \"781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6\": container with ID starting with 781787ed0e433869f887b25a2f2072bc585198f96de1f27007bbe50af2017cd6 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.651210 4952 scope.go:117] "RemoveContainer" containerID="569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.664363 4952 scope.go:117] "RemoveContainer" containerID="886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.680968 4952 scope.go:117] "RemoveContainer" containerID="420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.687002 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b700aff5-fe4e-45d2-840c-b4a5390b6b27-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.696355 4952 scope.go:117] "RemoveContainer" containerID="569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.696950 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940\": container with ID starting with 569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940 not found: ID does not exist" containerID="569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.697001 4952 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940"} err="failed to get container status \"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940\": rpc error: code = NotFound desc = could not find container \"569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940\": container with ID starting with 569e1e54b54852ebcada10fff1c86e024cd4294c4a87c5c51a4897b6f8019940 not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.697027 4952 scope.go:117] "RemoveContainer" containerID="886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.697433 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a\": container with ID starting with 886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a not found: ID does not exist" containerID="886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.697450 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a"} err="failed to get container status \"886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a\": rpc error: code = NotFound desc = could not find container \"886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a\": container with ID starting with 886b97d2927f0e02be003bca48f6fba44a03ba89f8b5bd935fb49623c811e22a not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.697463 4952 scope.go:117] "RemoveContainer" containerID="420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d" Nov 22 02:58:27 crc kubenswrapper[4952]: E1122 02:58:27.697970 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d\": container with ID starting with 420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d not found: ID does not exist" containerID="420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.698000 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d"} err="failed to get container status \"420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d\": rpc error: code = NotFound desc = could not find container \"420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d\": container with ID starting with 420df8bb6b77d55981e25ae92186bf0fe762923a4b6ac414439dcdde01673b3d not found: ID does not exist" Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.751640 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dg5wr"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.774059 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlwls"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.776940 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlwls"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.803846 4952 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.810567 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kqxwp"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.821424 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.824063 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwg2f"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.840817 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.848393 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9km"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.853861 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"] Nov 22 02:58:27 crc kubenswrapper[4952]: I1122 02:58:27.856087 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lklxp"] Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248212 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d4n4w"] Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248817 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248836 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248854 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248863 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248875 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248883 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248898 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248905 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248918 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248925 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 
02:58:28.248936 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248942 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248955 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248964 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248974 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.248981 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.248994 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249001 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="extract-content" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.249009 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249016 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.249024 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249031 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.249042 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249048 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="extract-utilities" Nov 22 02:58:28 crc kubenswrapper[4952]: E1122 02:58:28.249057 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249063 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249170 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249185 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" containerName="registry-server" Nov 22 
02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249231 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d87343-7102-459c-a231-294939870dc5" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249243 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" containerName="registry-server" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.249253 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" containerName="marketplace-operator" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.250159 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.252913 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.304657 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d4n4w"] Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.398074 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gx9\" (UniqueName: \"kubernetes.io/projected/8e4baf16-6aef-43f7-85c1-9426a926aae3-kube-api-access-f4gx9\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.398136 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-utilities\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.398158 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-catalog-content\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.490323 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" event={"ID":"421c7496-b72b-4558-8064-39b4578d0cda","Type":"ContainerStarted","Data":"b46f32f6cfee2904c3b0fd52fc88ab6df53f9957b8f50b5db88983c9f45ca0f3"} Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.490380 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" event={"ID":"421c7496-b72b-4558-8064-39b4578d0cda","Type":"ContainerStarted","Data":"d25f07926bbd99dc87704bb62ea3facaaf92547fc0809e470b543ddd82fcc272"} Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.490404 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.496730 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.499321 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gx9\" (UniqueName: \"kubernetes.io/projected/8e4baf16-6aef-43f7-85c1-9426a926aae3-kube-api-access-f4gx9\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.499384 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-utilities\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.499408 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-catalog-content\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.500019 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-catalog-content\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.500230 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e4baf16-6aef-43f7-85c1-9426a926aae3-utilities\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.513347 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dg5wr" podStartSLOduration=2.513324962 podStartE2EDuration="2.513324962s" podCreationTimestamp="2025-11-22 02:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:58:28.509781589 +0000 UTC m=+272.815798862" watchObservedRunningTime="2025-11-22 02:58:28.513324962 +0000 UTC m=+272.819342235" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.523418 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gx9\" (UniqueName: \"kubernetes.io/projected/8e4baf16-6aef-43f7-85c1-9426a926aae3-kube-api-access-f4gx9\") pod \"certified-operators-d4n4w\" (UID: \"8e4baf16-6aef-43f7-85c1-9426a926aae3\") " pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.542123 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b3681c-313d-40d1-b1f9-c8410c81dc20" path="/var/lib/kubelet/pods/85b3681c-313d-40d1-b1f9-c8410c81dc20/volumes" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.543126 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f424b2-2973-4b6a-99dd-08fd5b237adf" path="/var/lib/kubelet/pods/a4f424b2-2973-4b6a-99dd-08fd5b237adf/volumes" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.544095 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b700aff5-fe4e-45d2-840c-b4a5390b6b27" 
path="/var/lib/kubelet/pods/b700aff5-fe4e-45d2-840c-b4a5390b6b27/volumes" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.545949 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d87343-7102-459c-a231-294939870dc5" path="/var/lib/kubelet/pods/d2d87343-7102-459c-a231-294939870dc5/volumes" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.546818 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc00999c-0e40-4bca-b54a-2d416d925514" path="/var/lib/kubelet/pods/fc00999c-0e40-4bca-b54a-2d416d925514/volumes" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.564131 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:28 crc kubenswrapper[4952]: I1122 02:58:28.767137 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d4n4w"] Nov 22 02:58:28 crc kubenswrapper[4952]: W1122 02:58:28.779037 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4baf16_6aef_43f7_85c1_9426a926aae3.slice/crio-a1c48ef3df6be76bb358f777482042154abf06e456e4c015b3908d94d6735fd2 WatchSource:0}: Error finding container a1c48ef3df6be76bb358f777482042154abf06e456e4c015b3908d94d6735fd2: Status 404 returned error can't find the container with id a1c48ef3df6be76bb358f777482042154abf06e456e4c015b3908d94d6735fd2 Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.248437 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nhtvn"] Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.250513 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.253366 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.263237 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhtvn"] Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.412925 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-utilities\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.413170 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4fnj\" (UniqueName: \"kubernetes.io/projected/cb648a4c-177a-43a5-a924-0e0d60ad85ae-kube-api-access-g4fnj\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.413502 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-catalog-content\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.515116 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-catalog-content\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.515256 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-utilities\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.515343 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4fnj\" (UniqueName: \"kubernetes.io/projected/cb648a4c-177a-43a5-a924-0e0d60ad85ae-kube-api-access-g4fnj\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.515699 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-catalog-content\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.516102 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb648a4c-177a-43a5-a924-0e0d60ad85ae-utilities\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.516515 4952 generic.go:334] "Generic (PLEG): container finished" podID="8e4baf16-6aef-43f7-85c1-9426a926aae3" containerID="6cf3710a3a1bdb6320b3455e881ec719419840c28aded949c3934dab36cb9465" exitCode=0 Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.517646 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d4n4w" event={"ID":"8e4baf16-6aef-43f7-85c1-9426a926aae3","Type":"ContainerDied","Data":"6cf3710a3a1bdb6320b3455e881ec719419840c28aded949c3934dab36cb9465"} Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.517687 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d4n4w" event={"ID":"8e4baf16-6aef-43f7-85c1-9426a926aae3","Type":"ContainerStarted","Data":"a1c48ef3df6be76bb358f777482042154abf06e456e4c015b3908d94d6735fd2"} Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.540108 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4fnj\" (UniqueName: \"kubernetes.io/projected/cb648a4c-177a-43a5-a924-0e0d60ad85ae-kube-api-access-g4fnj\") pod \"community-operators-nhtvn\" (UID: \"cb648a4c-177a-43a5-a924-0e0d60ad85ae\") " pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.617331 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:29 crc kubenswrapper[4952]: I1122 02:58:29.862083 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhtvn"] Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.527606 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d4n4w" event={"ID":"8e4baf16-6aef-43f7-85c1-9426a926aae3","Type":"ContainerStarted","Data":"8164dd2d637bdb57161924c860303fb786a3c620b933a8451ca80da0112117ed"} Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.531733 4952 generic.go:334] "Generic (PLEG): container finished" podID="cb648a4c-177a-43a5-a924-0e0d60ad85ae" containerID="5822763e5c8dcf5102699060eb0e05a3ddfb25b820f388f7c244e5bf1b651b45" exitCode=0 Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.544506 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhtvn" event={"ID":"cb648a4c-177a-43a5-a924-0e0d60ad85ae","Type":"ContainerDied","Data":"5822763e5c8dcf5102699060eb0e05a3ddfb25b820f388f7c244e5bf1b651b45"} Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.544576 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhtvn" event={"ID":"cb648a4c-177a-43a5-a924-0e0d60ad85ae","Type":"ContainerStarted","Data":"a4a735e1517cd8dca10cc4f6fab2a1118a924502031284ad6be16cb2d3d28ef6"} Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.659062 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gfmzc"] Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.660302 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.665040 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.665443 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfmzc"] Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.742882 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btp4m\" (UniqueName: \"kubernetes.io/projected/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-kube-api-access-btp4m\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.742999 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-catalog-content\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.743042 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-utilities\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.843439 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-utilities\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.843525 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btp4m\" (UniqueName: \"kubernetes.io/projected/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-kube-api-access-btp4m\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.843602 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-catalog-content\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.844118 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-utilities\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.844223 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-catalog-content\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:30 crc kubenswrapper[4952]: I1122 02:58:30.881208 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btp4m\" (UniqueName: \"kubernetes.io/projected/d5ef5937-ec8f-4c34-a7a7-9b49399ef460-kube-api-access-btp4m\") pod \"redhat-marketplace-gfmzc\" (UID: \"d5ef5937-ec8f-4c34-a7a7-9b49399ef460\") " pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.048685 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.281389 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfmzc"] Nov 22 02:58:31 crc kubenswrapper[4952]: W1122 02:58:31.292733 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ef5937_ec8f_4c34_a7a7_9b49399ef460.slice/crio-824ae1b99dcd170d53cef0ce35c3951078c869665b0fc10ccf8bcfb046c960ea WatchSource:0}: Error finding container 824ae1b99dcd170d53cef0ce35c3951078c869665b0fc10ccf8bcfb046c960ea: Status 404 returned error can't find the container with id 824ae1b99dcd170d53cef0ce35c3951078c869665b0fc10ccf8bcfb046c960ea Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.538822 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhtvn" event={"ID":"cb648a4c-177a-43a5-a924-0e0d60ad85ae","Type":"ContainerStarted","Data":"c4878eb02ed8cd8419c3e1707bd0f6bdb4c249f70edf0a99a7de1b662449442d"} Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.540765 4952 generic.go:334] "Generic (PLEG): container finished" podID="8e4baf16-6aef-43f7-85c1-9426a926aae3" containerID="8164dd2d637bdb57161924c860303fb786a3c620b933a8451ca80da0112117ed" exitCode=0 Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.540847 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d4n4w" event={"ID":"8e4baf16-6aef-43f7-85c1-9426a926aae3","Type":"ContainerDied","Data":"8164dd2d637bdb57161924c860303fb786a3c620b933a8451ca80da0112117ed"} Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.543820 4952 generic.go:334] "Generic (PLEG): container finished" podID="d5ef5937-ec8f-4c34-a7a7-9b49399ef460" containerID="083fb859893f25543b81c47ad8a3a99b87b9d4e9ab5f7aba1ceafb2f72829462" exitCode=0 Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.543884 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfmzc" event={"ID":"d5ef5937-ec8f-4c34-a7a7-9b49399ef460","Type":"ContainerDied","Data":"083fb859893f25543b81c47ad8a3a99b87b9d4e9ab5f7aba1ceafb2f72829462"} Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.543911 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfmzc" event={"ID":"d5ef5937-ec8f-4c34-a7a7-9b49399ef460","Type":"ContainerStarted","Data":"824ae1b99dcd170d53cef0ce35c3951078c869665b0fc10ccf8bcfb046c960ea"} Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.649282 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5rgp"] Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.658369 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.660914 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.661456 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgqbj\" (UniqueName: \"kubernetes.io/projected/7be8e929-2755-4dea-9693-37a6d1a099bb-kube-api-access-xgqbj\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.661496 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-utilities\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.661524 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-catalog-content\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.663023 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5rgp"] Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.762944 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgqbj\" (UniqueName: \"kubernetes.io/projected/7be8e929-2755-4dea-9693-37a6d1a099bb-kube-api-access-xgqbj\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.762998 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-utilities\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.763036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-catalog-content\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.763582 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-catalog-content\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.763617 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be8e929-2755-4dea-9693-37a6d1a099bb-utilities\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " 
pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:31 crc kubenswrapper[4952]: I1122 02:58:31.785940 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgqbj\" (UniqueName: \"kubernetes.io/projected/7be8e929-2755-4dea-9693-37a6d1a099bb-kube-api-access-xgqbj\") pod \"redhat-operators-t5rgp\" (UID: \"7be8e929-2755-4dea-9693-37a6d1a099bb\") " pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.005116 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.267460 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5rgp"] Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.550633 4952 generic.go:334] "Generic (PLEG): container finished" podID="cb648a4c-177a-43a5-a924-0e0d60ad85ae" containerID="c4878eb02ed8cd8419c3e1707bd0f6bdb4c249f70edf0a99a7de1b662449442d" exitCode=0 Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.550740 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhtvn" event={"ID":"cb648a4c-177a-43a5-a924-0e0d60ad85ae","Type":"ContainerDied","Data":"c4878eb02ed8cd8419c3e1707bd0f6bdb4c249f70edf0a99a7de1b662449442d"} Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.557369 4952 generic.go:334] "Generic (PLEG): container finished" podID="7be8e929-2755-4dea-9693-37a6d1a099bb" containerID="5ed9b5790dc0358413317cbd869633b0fa9149a4f97f2ecaf538d535a98cffec" exitCode=0 Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.557442 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rgp" event={"ID":"7be8e929-2755-4dea-9693-37a6d1a099bb","Type":"ContainerDied","Data":"5ed9b5790dc0358413317cbd869633b0fa9149a4f97f2ecaf538d535a98cffec"} Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.557469 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rgp" event={"ID":"7be8e929-2755-4dea-9693-37a6d1a099bb","Type":"ContainerStarted","Data":"cbeead1bf07f691c41219eee77036825de4cda899b320caf99b5895a3554964e"} Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.569737 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d4n4w" event={"ID":"8e4baf16-6aef-43f7-85c1-9426a926aae3","Type":"ContainerStarted","Data":"52451a15dec310a9fb6eaebd0b207389496f544465f8273a25ce2f4212f93d7f"} Nov 22 02:58:32 crc kubenswrapper[4952]: I1122 02:58:32.597302 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d4n4w" podStartSLOduration=2.093109283 podStartE2EDuration="4.597278969s" podCreationTimestamp="2025-11-22 02:58:28 +0000 UTC" firstStartedPulling="2025-11-22 02:58:29.520404137 +0000 UTC m=+273.826421420" lastFinishedPulling="2025-11-22 02:58:32.024573833 +0000 UTC m=+276.330591106" observedRunningTime="2025-11-22 02:58:32.596135317 +0000 UTC m=+276.902152590" watchObservedRunningTime="2025-11-22 02:58:32.597278969 +0000 UTC m=+276.903296242" Nov 22 02:58:33 crc kubenswrapper[4952]: I1122 02:58:33.577295 4952 generic.go:334] "Generic (PLEG): container finished" podID="d5ef5937-ec8f-4c34-a7a7-9b49399ef460" containerID="7ecc69a59cc809696b4bfaadd3f7e9c25f2c5fa9df50151980957e958027f0ed" exitCode=0 Nov 22 02:58:33 crc kubenswrapper[4952]: I1122 02:58:33.577367 
4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfmzc" event={"ID":"d5ef5937-ec8f-4c34-a7a7-9b49399ef460","Type":"ContainerDied","Data":"7ecc69a59cc809696b4bfaadd3f7e9c25f2c5fa9df50151980957e958027f0ed"} Nov 22 02:58:33 crc kubenswrapper[4952]: I1122 02:58:33.580941 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhtvn" event={"ID":"cb648a4c-177a-43a5-a924-0e0d60ad85ae","Type":"ContainerStarted","Data":"f780e8318c331dce4bbac5d6762f6154c07550ad6cb89e575827b249c05663f6"} Nov 22 02:58:33 crc kubenswrapper[4952]: I1122 02:58:33.587928 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rgp" event={"ID":"7be8e929-2755-4dea-9693-37a6d1a099bb","Type":"ContainerStarted","Data":"ec8fbc51ee58781911bd4b3d1d1ca6817e4283cc9a5f5036753210cfb763b983"} Nov 22 02:58:33 crc kubenswrapper[4952]: I1122 02:58:33.618968 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nhtvn" podStartSLOduration=2.148815022 podStartE2EDuration="4.618939576s" podCreationTimestamp="2025-11-22 02:58:29 +0000 UTC" firstStartedPulling="2025-11-22 02:58:30.533360001 +0000 UTC m=+274.839377274" lastFinishedPulling="2025-11-22 02:58:33.003484555 +0000 UTC m=+277.309501828" observedRunningTime="2025-11-22 02:58:33.617910136 +0000 UTC m=+277.923927409" watchObservedRunningTime="2025-11-22 02:58:33.618939576 +0000 UTC m=+277.924956849" Nov 22 02:58:34 crc kubenswrapper[4952]: I1122 02:58:34.604756 4952 generic.go:334] "Generic (PLEG): container finished" podID="7be8e929-2755-4dea-9693-37a6d1a099bb" containerID="ec8fbc51ee58781911bd4b3d1d1ca6817e4283cc9a5f5036753210cfb763b983" exitCode=0 Nov 22 02:58:34 crc kubenswrapper[4952]: I1122 02:58:34.604822 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rgp" event={"ID":"7be8e929-2755-4dea-9693-37a6d1a099bb","Type":"ContainerDied","Data":"ec8fbc51ee58781911bd4b3d1d1ca6817e4283cc9a5f5036753210cfb763b983"} Nov 22 02:58:34 crc kubenswrapper[4952]: I1122 02:58:34.610877 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfmzc" event={"ID":"d5ef5937-ec8f-4c34-a7a7-9b49399ef460","Type":"ContainerStarted","Data":"167e305d9fa1d1f181085d11e96b599b400a0c4df770d216fb7637cf3b8258d7"} Nov 22 02:58:36 crc kubenswrapper[4952]: I1122 02:58:36.627310 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5rgp" event={"ID":"7be8e929-2755-4dea-9693-37a6d1a099bb","Type":"ContainerStarted","Data":"126a3043673d4edc9cd3049ee755fe16dd4bde4f54aa1e6cc42c520f4159b20b"} Nov 22 02:58:36 crc kubenswrapper[4952]: I1122 02:58:36.648725 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t5rgp" podStartSLOduration=3.221266378 podStartE2EDuration="5.648703477s" podCreationTimestamp="2025-11-22 02:58:31 +0000 UTC" firstStartedPulling="2025-11-22 02:58:32.56544492 +0000 UTC m=+276.871462193" lastFinishedPulling="2025-11-22 02:58:34.992882019 +0000 UTC m=+279.298899292" observedRunningTime="2025-11-22 02:58:36.646652018 +0000 UTC m=+280.952669311" watchObservedRunningTime="2025-11-22 02:58:36.648703477 +0000 UTC m=+280.954720750" Nov 22 02:58:36 crc kubenswrapper[4952]: I1122 02:58:36.651173 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-gfmzc" podStartSLOduration=4.237158876 podStartE2EDuration="6.651159648s" podCreationTimestamp="2025-11-22 02:58:30 +0000 UTC" firstStartedPulling="2025-11-22 02:58:31.545294437 +0000 UTC m=+275.851311710" lastFinishedPulling="2025-11-22 02:58:33.959295209 +0000 UTC m=+278.265312482" observedRunningTime="2025-11-22 02:58:34.663261217 +0000 UTC m=+278.969278490" watchObservedRunningTime="2025-11-22 02:58:36.651159648 +0000 UTC m=+280.957176921" Nov 22 02:58:38 crc kubenswrapper[4952]: I1122 02:58:38.565062 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:38 crc kubenswrapper[4952]: I1122 02:58:38.565600 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:38 crc kubenswrapper[4952]: I1122 02:58:38.604581 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:38 crc kubenswrapper[4952]: I1122 02:58:38.683989 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d4n4w" Nov 22 02:58:39 crc kubenswrapper[4952]: I1122 02:58:39.617844 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:39 crc kubenswrapper[4952]: I1122 02:58:39.618333 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:39 crc kubenswrapper[4952]: I1122 02:58:39.666926 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:39 crc kubenswrapper[4952]: I1122 02:58:39.725323 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nhtvn" Nov 22 02:58:41 crc kubenswrapper[4952]: I1122 02:58:41.050693 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:41 crc kubenswrapper[4952]: I1122 02:58:41.050748 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:41 crc kubenswrapper[4952]: I1122 02:58:41.094523 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:41 crc kubenswrapper[4952]: I1122 02:58:41.717995 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gfmzc" Nov 22 02:58:42 crc kubenswrapper[4952]: I1122 02:58:42.006380 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:42 crc kubenswrapper[4952]: I1122 02:58:42.006462 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:43 crc kubenswrapper[4952]: I1122 02:58:43.065295 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t5rgp" podUID="7be8e929-2755-4dea-9693-37a6d1a099bb" containerName="registry-server" probeResult="failure" output=< Nov 22 02:58:43 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s Nov 22 02:58:43 crc 
kubenswrapper[4952]: > Nov 22 02:58:52 crc kubenswrapper[4952]: I1122 02:58:52.070376 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:58:52 crc kubenswrapper[4952]: I1122 02:58:52.123848 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5rgp" Nov 22 02:59:58 crc kubenswrapper[4952]: I1122 02:59:58.342224 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:59:58 crc kubenswrapper[4952]: I1122 02:59:58.342906 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.154389 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p"] Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.157456 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.162131 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.162225 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.179557 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p"] Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.272444 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.272493 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69d59\" (UniqueName: \"kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.272685 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.374513 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.374844 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.375721 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69d59\" (UniqueName: \"kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.375929 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.381773 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.393891 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69d59\" (UniqueName: \"kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59\") pod \"collect-profiles-29396340-ksr4p\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.477944 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:00 crc kubenswrapper[4952]: I1122 03:00:00.671689 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p"] Nov 22 03:00:00 crc kubenswrapper[4952]: E1122 03:00:00.929842 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1b83df_c857_4b28_beab_c05670fe1b0b.slice/crio-7546d98f14d72669632d1a7dede418be0031153b6253bbb381f71b05b1386da5.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:00:01 crc kubenswrapper[4952]: I1122 03:00:01.206599 4952 generic.go:334] "Generic (PLEG): container finished" podID="bc1b83df-c857-4b28-beab-c05670fe1b0b" containerID="7546d98f14d72669632d1a7dede418be0031153b6253bbb381f71b05b1386da5" exitCode=0 Nov 22 03:00:01 crc kubenswrapper[4952]: I1122 03:00:01.206659 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" event={"ID":"bc1b83df-c857-4b28-beab-c05670fe1b0b","Type":"ContainerDied","Data":"7546d98f14d72669632d1a7dede418be0031153b6253bbb381f71b05b1386da5"} Nov 22 03:00:01 crc kubenswrapper[4952]: I1122 03:00:01.206693 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" event={"ID":"bc1b83df-c857-4b28-beab-c05670fe1b0b","Type":"ContainerStarted","Data":"0672a160209650df2e143560bf1906d02f914b45e1eb72483534e875f018d9f4"} Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.511005 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.630058 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69d59\" (UniqueName: \"kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59\") pod \"bc1b83df-c857-4b28-beab-c05670fe1b0b\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.630505 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume\") pod \"bc1b83df-c857-4b28-beab-c05670fe1b0b\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.630629 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume\") pod \"bc1b83df-c857-4b28-beab-c05670fe1b0b\" (UID: \"bc1b83df-c857-4b28-beab-c05670fe1b0b\") " Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.631033 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc1b83df-c857-4b28-beab-c05670fe1b0b" (UID: "bc1b83df-c857-4b28-beab-c05670fe1b0b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.631427 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc1b83df-c857-4b28-beab-c05670fe1b0b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.637794 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59" (OuterVolumeSpecName: "kube-api-access-69d59") pod "bc1b83df-c857-4b28-beab-c05670fe1b0b" (UID: "bc1b83df-c857-4b28-beab-c05670fe1b0b"). InnerVolumeSpecName "kube-api-access-69d59". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.642881 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc1b83df-c857-4b28-beab-c05670fe1b0b" (UID: "bc1b83df-c857-4b28-beab-c05670fe1b0b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.732336 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69d59\" (UniqueName: \"kubernetes.io/projected/bc1b83df-c857-4b28-beab-c05670fe1b0b-kube-api-access-69d59\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:02 crc kubenswrapper[4952]: I1122 03:00:02.732400 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc1b83df-c857-4b28-beab-c05670fe1b0b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:03 crc kubenswrapper[4952]: I1122 03:00:03.221160 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" event={"ID":"bc1b83df-c857-4b28-beab-c05670fe1b0b","Type":"ContainerDied","Data":"0672a160209650df2e143560bf1906d02f914b45e1eb72483534e875f018d9f4"} Nov 22 03:00:03 crc kubenswrapper[4952]: I1122 03:00:03.221214 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0672a160209650df2e143560bf1906d02f914b45e1eb72483534e875f018d9f4" Nov 22 03:00:03 crc kubenswrapper[4952]: I1122 03:00:03.221335 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p" Nov 22 03:00:28 crc kubenswrapper[4952]: I1122 03:00:28.342956 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:00:28 crc kubenswrapper[4952]: I1122 03:00:28.343682 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.342195 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.342941 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.343004 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.343907 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.343981 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a" gracePeriod=600 Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.603015 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a" exitCode=0 Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.603159 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a"} Nov 22 03:00:58 crc kubenswrapper[4952]: I1122 03:00:58.603614 4952 scope.go:117] "RemoveContainer" containerID="a9e083c9a1dc9afe916a6487b15e61efdca999efc376406d91e37b978d734ea3" Nov 22 03:00:59 crc kubenswrapper[4952]: I1122 03:00:59.616075 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb"} Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.773068 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zszd9"] Nov 22 03:02:08 crc kubenswrapper[4952]: E1122 03:02:08.774220 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1b83df-c857-4b28-beab-c05670fe1b0b" containerName="collect-profiles" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.774256 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1b83df-c857-4b28-beab-c05670fe1b0b" containerName="collect-profiles" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.774399 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1b83df-c857-4b28-beab-c05670fe1b0b" containerName="collect-profiles" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.775017 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.787828 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zszd9"] Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.874753 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-bound-sa-token\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.874976 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a03d745-f29b-4eac-959b-390ac9f86a21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875115 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875166 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-trusted-ca\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875194 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a03d745-f29b-4eac-959b-390ac9f86a21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875283 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-tls\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875487 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxs67\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-kube-api-access-mxs67\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.875637 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-certificates\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.902021 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.977765 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-certificates\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.977843 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-bound-sa-token\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.977871 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a03d745-f29b-4eac-959b-390ac9f86a21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.977911 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-trusted-ca\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.977934 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a03d745-f29b-4eac-959b-390ac9f86a21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.978011 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-tls\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.978759 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a03d745-f29b-4eac-959b-390ac9f86a21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.978831 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxs67\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-kube-api-access-mxs67\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.979861 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-trusted-ca\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.981390 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-certificates\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.985929 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-registry-tls\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.992251 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a03d745-f29b-4eac-959b-390ac9f86a21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:08 crc kubenswrapper[4952]: I1122 03:02:08.998418 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-bound-sa-token\") pod \"image-registry-66df7c8f76-zszd9\" (UID: 
\"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:09 crc kubenswrapper[4952]: I1122 03:02:09.000562 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxs67\" (UniqueName: \"kubernetes.io/projected/5a03d745-f29b-4eac-959b-390ac9f86a21-kube-api-access-mxs67\") pod \"image-registry-66df7c8f76-zszd9\" (UID: \"5a03d745-f29b-4eac-959b-390ac9f86a21\") " pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:09 crc kubenswrapper[4952]: I1122 03:02:09.096643 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:09 crc kubenswrapper[4952]: I1122 03:02:09.382619 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zszd9"] Nov 22 03:02:10 crc kubenswrapper[4952]: I1122 03:02:10.119306 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" event={"ID":"5a03d745-f29b-4eac-959b-390ac9f86a21","Type":"ContainerStarted","Data":"10b65fa24c0af4d3a501274b6c72fafac211476d6f5d0809dde515beb1652b75"} Nov 22 03:02:10 crc kubenswrapper[4952]: I1122 03:02:10.119395 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" event={"ID":"5a03d745-f29b-4eac-959b-390ac9f86a21","Type":"ContainerStarted","Data":"f58e348148d9ca7e9a8f013ed741c36e1c56e51e6a53ef89ba276cbdb75187ca"} Nov 22 03:02:10 crc kubenswrapper[4952]: I1122 03:02:10.119493 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:10 crc kubenswrapper[4952]: I1122 03:02:10.152942 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" podStartSLOduration=2.152910445 podStartE2EDuration="2.152910445s" podCreationTimestamp="2025-11-22 03:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:02:10.145993657 +0000 UTC m=+494.452010990" watchObservedRunningTime="2025-11-22 03:02:10.152910445 +0000 UTC m=+494.458927748" Nov 22 03:02:29 crc kubenswrapper[4952]: I1122 03:02:29.107652 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zszd9" Nov 22 03:02:29 crc kubenswrapper[4952]: I1122 03:02:29.183299 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"] Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.235496 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" podUID="073b4a27-3e98-4d0d-a2b7-62a89a434907" containerName="registry" containerID="cri-o://0977207663e74354579e8bb1ed76de546ca01a79756275e3ae66ba31b92d4b13" gracePeriod=30 Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.423901 4952 generic.go:334] "Generic (PLEG): container finished" podID="073b4a27-3e98-4d0d-a2b7-62a89a434907" containerID="0977207663e74354579e8bb1ed76de546ca01a79756275e3ae66ba31b92d4b13" exitCode=0 Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.423972 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" 
event={"ID":"073b4a27-3e98-4d0d-a2b7-62a89a434907","Type":"ContainerDied","Data":"0977207663e74354579e8bb1ed76de546ca01a79756275e3ae66ba31b92d4b13"} Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.625787 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.666511 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.667805 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668028 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668115 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668241 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-594zj\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668277 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668360 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668486 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.668583 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token\") pod \"073b4a27-3e98-4d0d-a2b7-62a89a434907\" (UID: \"073b4a27-3e98-4d0d-a2b7-62a89a434907\") " Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.669024 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.669154 4952 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.669187 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/073b4a27-3e98-4d0d-a2b7-62a89a434907-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.676914 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.677812 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.677907 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.679459 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj" (OuterVolumeSpecName: "kube-api-access-594zj") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "kube-api-access-594zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.685284 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.692348 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "073b4a27-3e98-4d0d-a2b7-62a89a434907" (UID: "073b4a27-3e98-4d0d-a2b7-62a89a434907"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.770016 4952 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.770061 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-594zj\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-kube-api-access-594zj\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.770073 4952 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/073b4a27-3e98-4d0d-a2b7-62a89a434907-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.770084 4952 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/073b4a27-3e98-4d0d-a2b7-62a89a434907-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:54 crc kubenswrapper[4952]: I1122 03:02:54.770095 4952 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/073b4a27-3e98-4d0d-a2b7-62a89a434907-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:55 crc kubenswrapper[4952]: I1122 03:02:55.435298 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" event={"ID":"073b4a27-3e98-4d0d-a2b7-62a89a434907","Type":"ContainerDied","Data":"e5d8ac7c529cf0bc4c8d5be6ce3ab26e5ec207e8493d63769d79730a1b686cb1"} Nov 22 03:02:55 crc kubenswrapper[4952]: I1122 03:02:55.435381 4952 scope.go:117] "RemoveContainer" containerID="0977207663e74354579e8bb1ed76de546ca01a79756275e3ae66ba31b92d4b13" Nov 22 03:02:55 crc kubenswrapper[4952]: I1122 03:02:55.435426 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p9bf7" Nov 22 03:02:55 crc kubenswrapper[4952]: I1122 03:02:55.483371 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"] Nov 22 03:02:55 crc kubenswrapper[4952]: I1122 03:02:55.486924 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p9bf7"] Nov 22 03:02:56 crc kubenswrapper[4952]: I1122 03:02:56.544836 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073b4a27-3e98-4d0d-a2b7-62a89a434907" path="/var/lib/kubelet/pods/073b4a27-3e98-4d0d-a2b7-62a89a434907/volumes" Nov 22 03:02:58 crc kubenswrapper[4952]: I1122 03:02:58.342480 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:02:58 crc kubenswrapper[4952]: I1122 03:02:58.342592 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:03:28 crc kubenswrapper[4952]: I1122 03:03:28.341953 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:03:28 crc kubenswrapper[4952]: I1122 03:03:28.342821 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.130870 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfj8s"] Nov 22 03:03:47 crc kubenswrapper[4952]: E1122 03:03:47.131924 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073b4a27-3e98-4d0d-a2b7-62a89a434907" containerName="registry" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.131942 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="073b4a27-3e98-4d0d-a2b7-62a89a434907" containerName="registry" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.132075 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="073b4a27-3e98-4d0d-a2b7-62a89a434907" containerName="registry" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.132686 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.135120 4952 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zdx9g" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.136510 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.136572 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.137890 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2zjtj"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.138789 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2zjtj" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.140507 4952 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bdcq2" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.150157 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfj8s"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.159829 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2zjtj"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.163175 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-m52fp"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.164129 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.167509 4952 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bgqvn" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.174680 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-m52fp"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.190359 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfnr8\" (UniqueName: \"kubernetes.io/projected/17db6d4f-216b-4307-9625-962bc50bc029-kube-api-access-lfnr8\") pod \"cert-manager-webhook-5655c58dd6-m52fp\" (UID: \"17db6d4f-216b-4307-9625-962bc50bc029\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.190410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bd2n\" (UniqueName: \"kubernetes.io/projected/66e3470d-6627-44f4-be2d-b7f64ef73e9b-kube-api-access-6bd2n\") pod \"cert-manager-cainjector-7f985d654d-tfj8s\" (UID: \"66e3470d-6627-44f4-be2d-b7f64ef73e9b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.190438 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6jx\" (UniqueName: \"kubernetes.io/projected/a6cc04eb-89a2-4ef1-a147-0be36ae00d02-kube-api-access-lq6jx\") pod \"cert-manager-5b446d88c5-2zjtj\" (UID: \"a6cc04eb-89a2-4ef1-a147-0be36ae00d02\") " pod="cert-manager/cert-manager-5b446d88c5-2zjtj" Nov 22 03:03:47 
crc kubenswrapper[4952]: I1122 03:03:47.291602 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bd2n\" (UniqueName: \"kubernetes.io/projected/66e3470d-6627-44f4-be2d-b7f64ef73e9b-kube-api-access-6bd2n\") pod \"cert-manager-cainjector-7f985d654d-tfj8s\" (UID: \"66e3470d-6627-44f4-be2d-b7f64ef73e9b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.291666 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6jx\" (UniqueName: \"kubernetes.io/projected/a6cc04eb-89a2-4ef1-a147-0be36ae00d02-kube-api-access-lq6jx\") pod \"cert-manager-5b446d88c5-2zjtj\" (UID: \"a6cc04eb-89a2-4ef1-a147-0be36ae00d02\") " pod="cert-manager/cert-manager-5b446d88c5-2zjtj" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.291748 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfnr8\" (UniqueName: \"kubernetes.io/projected/17db6d4f-216b-4307-9625-962bc50bc029-kube-api-access-lfnr8\") pod \"cert-manager-webhook-5655c58dd6-m52fp\" (UID: \"17db6d4f-216b-4307-9625-962bc50bc029\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.312901 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bd2n\" (UniqueName: \"kubernetes.io/projected/66e3470d-6627-44f4-be2d-b7f64ef73e9b-kube-api-access-6bd2n\") pod \"cert-manager-cainjector-7f985d654d-tfj8s\" (UID: \"66e3470d-6627-44f4-be2d-b7f64ef73e9b\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.314192 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfnr8\" (UniqueName: \"kubernetes.io/projected/17db6d4f-216b-4307-9625-962bc50bc029-kube-api-access-lfnr8\") pod \"cert-manager-webhook-5655c58dd6-m52fp\" (UID: \"17db6d4f-216b-4307-9625-962bc50bc029\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.315415 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6jx\" (UniqueName: \"kubernetes.io/projected/a6cc04eb-89a2-4ef1-a147-0be36ae00d02-kube-api-access-lq6jx\") pod \"cert-manager-5b446d88c5-2zjtj\" (UID: \"a6cc04eb-89a2-4ef1-a147-0be36ae00d02\") " pod="cert-manager/cert-manager-5b446d88c5-2zjtj" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.455082 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.465856 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2zjtj" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.480013 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.740587 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tfj8s"] Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.762528 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.764379 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-m52fp"] Nov 22 03:03:47 crc kubenswrapper[4952]: W1122 03:03:47.773287 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17db6d4f_216b_4307_9625_962bc50bc029.slice/crio-ad2f36d71d7ba5e3b9df267a4ca40218adee8e7ac60a21978e6ab76f004acc5f WatchSource:0}: Error finding container ad2f36d71d7ba5e3b9df267a4ca40218adee8e7ac60a21978e6ab76f004acc5f: Status 404 returned error can't find the container with id ad2f36d71d7ba5e3b9df267a4ca40218adee8e7ac60a21978e6ab76f004acc5f Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.819201 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" event={"ID":"66e3470d-6627-44f4-be2d-b7f64ef73e9b","Type":"ContainerStarted","Data":"3622334bedc6bb12094af0adbf860626ba0b673b6242f08372f06c4970cce4ad"} Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.820966 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" event={"ID":"17db6d4f-216b-4307-9625-962bc50bc029","Type":"ContainerStarted","Data":"ad2f36d71d7ba5e3b9df267a4ca40218adee8e7ac60a21978e6ab76f004acc5f"} Nov 22 03:03:47 crc kubenswrapper[4952]: I1122 03:03:47.908971 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2zjtj"] Nov 22 03:03:48 crc kubenswrapper[4952]: W1122 03:03:48.765734 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6cc04eb_89a2_4ef1_a147_0be36ae00d02.slice/crio-87f1c25b96c12151512bf36fbc3fc1f58873624127600a157eb3d81d3011653f WatchSource:0}: Error finding container 87f1c25b96c12151512bf36fbc3fc1f58873624127600a157eb3d81d3011653f: Status 404 returned error can't find the container with id 87f1c25b96c12151512bf36fbc3fc1f58873624127600a157eb3d81d3011653f Nov 22 03:03:48 crc kubenswrapper[4952]: I1122 03:03:48.834711 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2zjtj" event={"ID":"a6cc04eb-89a2-4ef1-a147-0be36ae00d02","Type":"ContainerStarted","Data":"87f1c25b96c12151512bf36fbc3fc1f58873624127600a157eb3d81d3011653f"} Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.858758 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" event={"ID":"66e3470d-6627-44f4-be2d-b7f64ef73e9b","Type":"ContainerStarted","Data":"fd3b2daa1f92391aaa176b808a49009fad10e74fb4c7eaf17c2967c3dcbc55ee"} Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.862774 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" event={"ID":"17db6d4f-216b-4307-9625-962bc50bc029","Type":"ContainerStarted","Data":"590ed611bccdbce57f5e5b39d80fb8c94a3cd3abf613aa6c4c298572a0e7b110"} Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.862918 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.865097 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2zjtj" event={"ID":"a6cc04eb-89a2-4ef1-a147-0be36ae00d02","Type":"ContainerStarted","Data":"1f65717a4f3d27272524cda3f3e7612b9782d3f20b0c1761922754c7e5784307"} Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.877643 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-tfj8s" podStartSLOduration=1.022948756 podStartE2EDuration="4.877618099s" podCreationTimestamp="2025-11-22 03:03:47 +0000 UTC" firstStartedPulling="2025-11-22 03:03:47.762242131 +0000 UTC m=+592.068259404" lastFinishedPulling="2025-11-22 03:03:51.616911464 +0000 UTC m=+595.922928747" observedRunningTime="2025-11-22 03:03:51.877009854 +0000 UTC m=+596.183027147" watchObservedRunningTime="2025-11-22 03:03:51.877618099 +0000 UTC m=+596.183635372" Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.894100 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-2zjtj" podStartSLOduration=2.072396075 podStartE2EDuration="4.894073137s" podCreationTimestamp="2025-11-22 03:03:47 +0000 UTC" firstStartedPulling="2025-11-22 03:03:48.772686392 +0000 UTC m=+593.078703675" lastFinishedPulling="2025-11-22 03:03:51.594363404 +0000 UTC m=+595.900380737" observedRunningTime="2025-11-22 03:03:51.892363042 +0000 UTC m=+596.198380325" watchObservedRunningTime="2025-11-22 03:03:51.894073137 +0000 UTC m=+596.200090410" Nov 22 03:03:51 crc kubenswrapper[4952]: I1122 03:03:51.916352 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" podStartSLOduration=1.099072702 podStartE2EDuration="4.916329669s" podCreationTimestamp="2025-11-22 03:03:47 +0000 UTC" firstStartedPulling="2025-11-22 03:03:47.77569688 +0000 UTC m=+592.081714153" lastFinishedPulling="2025-11-22 03:03:51.592953837 +0000 UTC m=+595.898971120" observedRunningTime="2025-11-22 03:03:51.914074309 +0000 UTC m=+596.220091592" watchObservedRunningTime="2025-11-22 03:03:51.916329669 +0000 UTC m=+596.222346952" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.482942 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-m52fp" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.666303 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qnw6b"] Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.666939 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="nbdb" containerID="cri-o://af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667183 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="sbdb" containerID="cri-o://3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667278 4952 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667353 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-node" containerID="cri-o://d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667321 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="northd" containerID="cri-o://2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667455 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-acl-logging" containerID="cri-o://10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.667503 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-controller" containerID="cri-o://772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.742086 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller" containerID="cri-o://3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" gracePeriod=30 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.907631 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovnkube-controller/3.log" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.910239 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-acl-logging/0.log" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911039 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-controller/0.log" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911516 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" exitCode=0 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911645 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281" exitCode=0 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911688 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" exitCode=0 Nov 22 
03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911701 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" exitCode=0 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911713 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" exitCode=143 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911724 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" exitCode=143 Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911598 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911785 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911802 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911812 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911821 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911831 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b"} Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911857 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914209 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/2.log" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914689 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/1.log" Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914738 4952 generic.go:334] "Generic (PLEG): container finished" podID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" containerID="cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31" exitCode=2 Nov 22 03:03:57 crc 
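The exit codes above follow the standard shell convention for fatal signals: 128+N for signal N, so 143 = 128+15 (SIGTERM). ovn-controller and ovn-acl-logging were terminated by the grace-period SIGTERM, while the containers reporting exitCode=0 shut down cleanly on their own. A small Go decoder for that convention (the convention is standard; the describeExit helper is illustrative):

```go
package main

import (
	"fmt"
	"syscall"
)

func describeExit(code int) string {
	if code > 128 {
		sig := syscall.Signal(code - 128)
		return fmt.Sprintf("killed by signal %d (%s)", code-128, sig)
	}
	return fmt.Sprintf("exited with status %d (no fatal signal)", code)
}

func main() {
	fmt.Println(143, "=>", describeExit(143)) // signal 15 (terminated)
	fmt.Println(0, "=>", describeExit(0))     // clean shutdown
	fmt.Println(2, "=>", describeExit(2))     // kube-multus's error exit below
}
```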
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911598 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911785 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911802 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911812 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911821 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911831 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.911857 4952 scope.go:117] "RemoveContainer" containerID="b40468bea62f7fa68ccf47a00d09353678eb85c17469c2bec98094f34f8cc3a3"
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914209 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/2.log"
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914689 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/1.log"
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914738 4952 generic.go:334] "Generic (PLEG): container finished" podID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" containerID="cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31" exitCode=2
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.914778 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerDied","Data":"cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31"}
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.915395 4952 scope.go:117] "RemoveContainer" containerID="cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31"
Nov 22 03:03:57 crc kubenswrapper[4952]: E1122 03:03:57.915784 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-j9kg2_openshift-multus(ccedfe81-43b3-4af7-88c7-9953b33e7d13)\"" pod="openshift-multus/multus-j9kg2" podUID="ccedfe81-43b3-4af7-88c7-9953b33e7d13"
Nov 22 03:03:57 crc kubenswrapper[4952]: I1122 03:03:57.987883 4952 scope.go:117] "RemoveContainer" containerID="99a339123c6ff672532d842ec6714aa7588d6fdbc03f39380e1c715613526782"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.021402 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-acl-logging/0.log"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.022356 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-controller/0.log"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.022969 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088071 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6w66c"]
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088287 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="sbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088301 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="sbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088314 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088320 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088328 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088334 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088341 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088347 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088354 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088359 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088365 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kubecfg-setup"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088372 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kubecfg-setup"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088385 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="northd"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088392 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="northd"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088402 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="nbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088408 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="nbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088415 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-node"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088421 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-node"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088430 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088435 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088442 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-acl-logging"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088448 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-acl-logging"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088456 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-ovn-metrics"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088462 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-ovn-metrics"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088566 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088573 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088582 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088589 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="northd"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088598 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="nbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088604 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088615 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="sbdb"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088622 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088628 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-ovn-metrics"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088636 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="kube-rbac-proxy-node"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088642 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovn-acl-logging"
Nov 22 03:03:58 crc kubenswrapper[4952]: E1122 03:03:58.088734 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088740 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.088825 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerName="ovnkube-controller"
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.090304 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c"
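The long RemoveStaleState run above fires when the replacement pod ovnkube-node-6w66c is admitted: the CPU and memory managers walk their checkpointed per-container assignments and delete every entry whose pod no longer exists (ovnkube-controller appears repeatedly because each past restart left its own entry). A Go sketch of that cleanup pattern; the key struct, the map layout, and removeStaleState are invented for illustration, with only the behaviour mirroring the log:

```go
package main

import "fmt"

type key struct{ podUID, container string }

func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k) // deleting while ranging over a map is safe in Go
		}
	}
}

func main() {
	state := map[key]string{
		{"bef051cd-2285-4b6b-a16f-1154f4d1f5dd", "nbdb"}:   "cpuset",
		{"bef051cd-2285-4b6b-a16f-1154f4d1f5dd", "northd"}: "cpuset",
	}
	removeStaleState(state, map[string]bool{ /* deleted ovnkube pod not present */ })
}
```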
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.158817 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.158977 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.158980 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159027 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159086 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159117 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159120 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159152 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket" (OuterVolumeSpecName: "log-socket") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "log-socket".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159180 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159199 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159189 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159227 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159264 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159300 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159335 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159371 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdmhx\" (UniqueName: \"kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159412 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159441 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159444 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159459 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash" (OuterVolumeSpecName: "host-slash") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159488 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159525 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log" (OuterVolumeSpecName: "node-log") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159601 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159639 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159583 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159808 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159856 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159859 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159890 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159915 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.159945 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn\") pod \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\" (UID: \"bef051cd-2285-4b6b-a16f-1154f4d1f5dd\") " Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160037 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160057 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160082 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160094 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160207 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160399 4952 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160418 4952 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160433 4952 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160446 4952 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160458 4952 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160471 4952 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160485 4952 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160499 4952 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 
03:03:58.160515 4952 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160529 4952 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160573 4952 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160594 4952 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160611 4952 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160630 4952 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160645 4952 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160656 4952 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.160786 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.165536 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx" (OuterVolumeSpecName: "kube-api-access-jdmhx") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "kube-api-access-jdmhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.165703 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.177344 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bef051cd-2285-4b6b-a16f-1154f4d1f5dd" (UID: "bef051cd-2285-4b6b-a16f-1154f4d1f5dd"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.262452 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-netd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.262606 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.262653 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-var-lib-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.262698 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.262843 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-ovn\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263533 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263657 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-slash\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263722 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
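(UniqueName: \"kubernetes.io/secret/a7a81d42-b69b-452d-a8ee-eada127106e8-ovn-node-metrics-cert\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c"

Everything from 03:03:58.158817 to this point is one pass of the kubelet volume reconciler tearing down the deleted pod and wiring up its replacement: for each volume, reconciler_common.go:159 logs "UnmountVolume started", operation_generator.go:803 reports "UnmountVolume.TearDown succeeded", and reconciler_common.go:293 finally records "Volume detached"; the reconciler_common.go:245 "VerifyControllerAttachedVolume started" lines then walk the same volume set under the new pod UID a7a81d42-b69b-452d-a8ee-eada127106e8. A rough sketch of pairing those start/success messages per volume when reading such a journal on stdin (the regular expressions are fitted to the message shapes above and are assumptions, not an official log grammar):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var (
        // Matches: operationExecutor.UnmountVolume started for volume \"name\" ...
        unmountStarted = regexp.MustCompile(`UnmountVolume started for volume \\"([^"\\]+)\\"`)
        // Matches: UnmountVolume.TearDown succeeded ... (OuterVolumeSpecName: "name") ...
        tearDownOK = regexp.MustCompile(`UnmountVolume\.TearDown succeeded [^(]*\(OuterVolumeSpecName: "([^"]+)"\)`)
    )

    func main() {
        pending := map[string]bool{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can be very long
        for sc.Scan() {
            line := sc.Text()
            for _, m := range unmountStarted.FindAllStringSubmatch(line, -1) {
                pending[m[1]] = true
            }
            for _, m := range tearDownOK.FindAllStringSubmatch(line, -1) {
                delete(pending, m[1])
            }
        }
        for v := range pending {
            fmt.Println("unmount started but no TearDown success seen:", v)
        }
    }

Run against this journal it should print nothing for pod bef051cd-...: every unmount started here is matched by a TearDown success before the "Volume detached" lines.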
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263768 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-bin\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263803 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-env-overrides\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263870 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-systemd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.263987 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-netns\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264119 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gh6\" (UniqueName: \"kubernetes.io/projected/a7a81d42-b69b-452d-a8ee-eada127106e8-kube-api-access-72gh6\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264181 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-kubelet\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264210 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-etc-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264269 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-node-log\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264421 4952 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-log-socket\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264497 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-config\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264574 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-systemd-units\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264616 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-script-lib\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264733 4952 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264762 4952 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264782 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdmhx\" (UniqueName: \"kubernetes.io/projected/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-kube-api-access-jdmhx\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.264804 4952 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bef051cd-2285-4b6b-a16f-1154f4d1f5dd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.342483 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.342617 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.342686 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.343495 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.343597 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb" gracePeriod=600 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.365945 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7a81d42-b69b-452d-a8ee-eada127106e8-ovn-node-metrics-cert\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366012 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-bin\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366046 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-env-overrides\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366080 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-systemd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366109 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-netns\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366142 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gh6\" (UniqueName: \"kubernetes.io/projected/a7a81d42-b69b-452d-a8ee-eada127106e8-kube-api-access-72gh6\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366147 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-bin\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " 
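pod="openshift-ovn-kubernetes/ovnkube-node-6w66c"

Interleaved with the mount activity, 03:03:58.342 shows a complete liveness-probe failure sequence for machine-config-daemon-vn2dl: patch_prober.go:28 records the failed GET against http://127.0.0.1:8798/health ("connection refused"), prober.go:107 reports the Liveness probe failure, kubelet.go:2542 marks the pod unhealthy in the sync loop, and kuberuntime_container.go:808 kills container 0cf1c8c9... with gracePeriod=600 so it can be restarted. A hedged sketch of the check itself, a bare HTTP GET where a dial error or non-2xx status counts as failure (the endpoint is taken from the log; the 1-second timeout is an assumption):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeOnce mimics the check failing above: a GET against the daemon's
    // health endpoint; a dial error such as "connection refused" or a
    // non-2xx status counts as a probe failure.
    func probeOnce(url string) error {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 300 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("Probe failed:", err) // kubelet counts this toward the failure threshold
        }
    }

The restart completes further down: the ContainerDied/ContainerStarted pair for the same container ID 0cf1c8c9... appears at 03:03:58.955.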
Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366180 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-kubelet\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366211 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-etc-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366248 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-node-log\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-netns\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366332 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-log-socket\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366367 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-systemd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366370 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-config\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366458 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-systemd-units\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366463 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-node-log\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366565 4952 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-script-lib\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366621 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-kubelet\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366661 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-etc-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366671 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-netd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366625 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-cni-netd\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366774 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-log-socket\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366825 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366867 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-var-lib-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366930 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366970 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-ovn\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.366674 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-systemd-units\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367049 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-ovn\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367053 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-var-lib-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367059 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367090 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-run-openvswitch\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367225 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367257 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-slash\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367261 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.367303 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7a81d42-b69b-452d-a8ee-eada127106e8-host-slash\") 
pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.368184 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-config\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.368186 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-ovnkube-script-lib\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.368519 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7a81d42-b69b-452d-a8ee-eada127106e8-env-overrides\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.371020 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7a81d42-b69b-452d-a8ee-eada127106e8-ovn-node-metrics-cert\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.387839 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gh6\" (UniqueName: \"kubernetes.io/projected/a7a81d42-b69b-452d-a8ee-eada127106e8-kube-api-access-72gh6\") pod \"ovnkube-node-6w66c\" (UID: \"a7a81d42-b69b-452d-a8ee-eada127106e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.415409 4952 util.go:30] "No sandbox for pod can be found. 
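Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c"

With every MountVolume.SetUp reported as succeeded, the kubelet moves on to creating a sandbox for ovnkube-node-6w66c. The W-level manager.go:1169 "Failed to process watch event ... 404" below appears to be the cAdvisor-side watcher noticing the new crio-638b70c4... cgroup before CRI-O can answer for the container; in context it reads as a transient startup race rather than a real failure, since the same ID shows up moments later in a ContainerStarted event. The generic.go:334 / kubelet.go:2453 lines after it are PLEG (pod lifecycle event generator) events; a simplified stand-in for that event shape and how the sync loop reacts to it (field names here are illustrative, not kubelet's actual types):

    package main

    import "fmt"

    // podLifecycleEvent is a toy version of the PLEG events logged below,
    // e.g. "SyncLoop (PLEG): event for pod ... ContainerDied".
    type podLifecycleEvent struct {
        PodID string // pod UID
        Type  string // "ContainerDied" or "ContainerStarted"
        Data  string // container or sandbox ID
    }

    func handle(ev podLifecycleEvent) {
        switch ev.Type {
        case "ContainerDied":
            // For a deleted pod this ends in the RemoveContainer calls
            // (scope.go:117) seen further down.
            fmt.Printf("pod %s: container %s died, scheduling pod sync\n", ev.PodID, ev.Data)
        case "ContainerStarted":
            fmt.Printf("pod %s: container %s started\n", ev.PodID, ev.Data)
        }
    }

    func main() {
        handle(podLifecycleEvent{"bef051cd-2285-4b6b-a16f-1154f4d1f5dd", "ContainerDied", "3beb68d15c5c..."})
        handle(podLifecycleEvent{"a7a81d42-b69b-452d-a8ee-eada127106e8", "ContainerStarted", "638b70c4c6c8..."})
    }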
Nov 22 03:03:58 crc kubenswrapper[4952]: W1122 03:03:58.434515 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a81d42_b69b_452d_a8ee_eada127106e8.slice/crio-638b70c4c6c8bf47aadad2aa74b1a482bef860752f3f07150aa8930ecf6b8e23 WatchSource:0}: Error finding container 638b70c4c6c8bf47aadad2aa74b1a482bef860752f3f07150aa8930ecf6b8e23: Status 404 returned error can't find the container with id 638b70c4c6c8bf47aadad2aa74b1a482bef860752f3f07150aa8930ecf6b8e23 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.926057 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/2.log" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.942899 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-acl-logging/0.log" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.944043 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnw6b_bef051cd-2285-4b6b-a16f-1154f4d1f5dd/ovn-controller/0.log" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.945775 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" exitCode=0 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.945819 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" containerID="af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" exitCode=0 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.945878 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.945965 4952 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.945993 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.946034 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnw6b" event={"ID":"bef051cd-2285-4b6b-a16f-1154f4d1f5dd","Type":"ContainerDied","Data":"50dc2b64957657973595ac2fe0f873643bc0931db98df6606c180a3e370fef94"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.946083 4952 scope.go:117] "RemoveContainer" containerID="3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.948995 4952 generic.go:334] "Generic (PLEG): container finished" podID="a7a81d42-b69b-452d-a8ee-eada127106e8" containerID="686a607d54c86c2540e9a61dee028a869cb35fc4753c5b58a1c8354043c02333" exitCode=0 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.949299 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerDied","Data":"686a607d54c86c2540e9a61dee028a869cb35fc4753c5b58a1c8354043c02333"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.949534 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"638b70c4c6c8bf47aadad2aa74b1a482bef860752f3f07150aa8930ecf6b8e23"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.955723 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb" exitCode=0 Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.955808 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.955873 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b"} Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.985516 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qnw6b"] Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.991990 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qnw6b"] Nov 22 03:03:58 crc kubenswrapper[4952]: I1122 03:03:58.995068 4952 scope.go:117] "RemoveContainer" containerID="3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.016125 4952 scope.go:117] "RemoveContainer" containerID="af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.053796 4952 scope.go:117] "RemoveContainer" 
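containerID="2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"

From here down, each scope.go:117 "RemoveContainer" is followed by an E-level "ContainerStatus from runtime service failed ... NotFound": the kubelet is re-querying CRI-O for container IDs it has just removed, the runtime answers with gRPC NotFound, and pod_container_deletor.go:53 logs "DeleteContainer returned error". During cleanup of a pod that has already been deleted from the API ("SyncLoop REMOVE" above), this is expected noise, a double delete, rather than a genuine failure. A sketch of the treat-NotFound-as-success pattern using the standard gRPC status helpers (removeContainer is a hypothetical stand-in for the CRI call; building this needs the google.golang.org/grpc module):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer is a hypothetical CRI call that fails the way the log
    // shows: rpc error: code = NotFound desc = could not find container ...
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    // bestEffortRemove treats NotFound as "already deleted", which is why the
    // DeleteContainer errors below are noise during teardown of a removed pod.
    func bestEffortRemove(id string) error {
        if err := removeContainer(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    func main() {
        if err := bestEffortRemove("3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"); err != nil {
            fmt.Println("remove failed:", err)
        } else {
            fmt.Println("container removed or already absent")
        }
    }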
Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.095155 4952 scope.go:117] "RemoveContainer" containerID="6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.116330 4952 scope.go:117] "RemoveContainer" containerID="d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.132719 4952 scope.go:117] "RemoveContainer" containerID="10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.148824 4952 scope.go:117] "RemoveContainer" containerID="772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.168900 4952 scope.go:117] "RemoveContainer" containerID="57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.192128 4952 scope.go:117] "RemoveContainer" containerID="3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.193133 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0\": container with ID starting with 3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0 not found: ID does not exist" containerID="3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.193253 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"} err="failed to get container status \"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0\": rpc error: code = NotFound desc = could not find container \"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0\": container with ID starting with 3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.193342 4952 scope.go:117] "RemoveContainer" containerID="3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.194066 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\": container with ID starting with 3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de not found: ID does not exist" containerID="3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.194146 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de"} err="failed to get container status \"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\": rpc error: code = NotFound desc = could not find container \"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\": container with ID starting with 3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.194176 4952 scope.go:117] "RemoveContainer"
containerID="af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.194648 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\": container with ID starting with af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff not found: ID does not exist" containerID="af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.194731 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff"} err="failed to get container status \"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\": rpc error: code = NotFound desc = could not find container \"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\": container with ID starting with af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.194795 4952 scope.go:117] "RemoveContainer" containerID="2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.195291 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\": container with ID starting with 2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281 not found: ID does not exist" containerID="2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.195397 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"} err="failed to get container status \"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\": rpc error: code = NotFound desc = could not find container \"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\": container with ID starting with 2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.195494 4952 scope.go:117] "RemoveContainer" containerID="6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.195984 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\": container with ID starting with 6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0 not found: ID does not exist" containerID="6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.196062 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0"} err="failed to get container status \"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\": rpc error: code = NotFound desc = could not find container \"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\": container with ID starting with 
6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.196090 4952 scope.go:117] "RemoveContainer" containerID="d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.196602 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\": container with ID starting with d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b not found: ID does not exist" containerID="d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.196696 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b"} err="failed to get container status \"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\": rpc error: code = NotFound desc = could not find container \"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\": container with ID starting with d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.196738 4952 scope.go:117] "RemoveContainer" containerID="10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.197212 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\": container with ID starting with 10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0 not found: ID does not exist" containerID="10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.197275 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0"} err="failed to get container status \"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\": rpc error: code = NotFound desc = could not find container \"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\": container with ID starting with 10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.197322 4952 scope.go:117] "RemoveContainer" containerID="772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.197991 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\": container with ID starting with 772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b not found: ID does not exist" containerID="772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.198043 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b"} err="failed to get container status \"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\": rpc 
error: code = NotFound desc = could not find container \"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\": container with ID starting with 772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.198124 4952 scope.go:117] "RemoveContainer" containerID="57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903" Nov 22 03:03:59 crc kubenswrapper[4952]: E1122 03:03:59.198603 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\": container with ID starting with 57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903 not found: ID does not exist" containerID="57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.198649 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903"} err="failed to get container status \"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\": rpc error: code = NotFound desc = could not find container \"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\": container with ID starting with 57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.198683 4952 scope.go:117] "RemoveContainer" containerID="3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.199359 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0"} err="failed to get container status \"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0\": rpc error: code = NotFound desc = could not find container \"3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0\": container with ID starting with 3fb24293905b96fde1401820f7e9381bd272a84454d51902aac7e20cd67feba0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.199485 4952 scope.go:117] "RemoveContainer" containerID="3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.200319 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de"} err="failed to get container status \"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\": rpc error: code = NotFound desc = could not find container \"3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de\": container with ID starting with 3beb68d15c5caa4b604c4e83a2691040013afa0ab2ed882debfc623ee60d45de not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.200449 4952 scope.go:117] "RemoveContainer" containerID="af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.201258 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff"} err="failed to get container status \"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\": rpc 
error: code = NotFound desc = could not find container \"af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff\": container with ID starting with af0005f979bc8dba8b3e936545c1eda79d822532b76bca471e8dcf46e2811cff not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.201365 4952 scope.go:117] "RemoveContainer" containerID="2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.202009 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281"} err="failed to get container status \"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\": rpc error: code = NotFound desc = could not find container \"2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281\": container with ID starting with 2d0ce38acf2ba3328da0e02f49670ad3e6a4eead7d1a646f0cd597dc18331281 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.202089 4952 scope.go:117] "RemoveContainer" containerID="6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.202898 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0"} err="failed to get container status \"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\": rpc error: code = NotFound desc = could not find container \"6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0\": container with ID starting with 6059b37cee2103a5dc2cec58fec15eaef99fd5a51875ab4b1f6b9c7d8d038ef0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.202920 4952 scope.go:117] "RemoveContainer" containerID="d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.203347 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b"} err="failed to get container status \"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\": rpc error: code = NotFound desc = could not find container \"d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b\": container with ID starting with d909f1c53e60bd269b30a4af76c4b2b501e6cbfc771496bf9dda0a460982f44b not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.203367 4952 scope.go:117] "RemoveContainer" containerID="10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.203641 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0"} err="failed to get container status \"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\": rpc error: code = NotFound desc = could not find container \"10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0\": container with ID starting with 10e416be91d56daea456389c40f7aaf830157c370433e244d18e26249dc1ecf0 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.203658 4952 scope.go:117] "RemoveContainer" containerID="772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b" Nov 22 03:03:59 crc 
kubenswrapper[4952]: I1122 03:03:59.203955 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b"} err="failed to get container status \"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\": rpc error: code = NotFound desc = could not find container \"772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b\": container with ID starting with 772422abfe1e3164bac10ad49d197f59ad00bd75c5ccf9de390894373b1ec12b not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.203968 4952 scope.go:117] "RemoveContainer" containerID="57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.204255 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903"} err="failed to get container status \"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\": rpc error: code = NotFound desc = could not find container \"57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903\": container with ID starting with 57198fa86518ba4224a59d6f05dce4427475f51063d16cc0f31f5a8d214df903 not found: ID does not exist" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.204272 4952 scope.go:117] "RemoveContainer" containerID="283175bce2bea1ef54ec44437a69cc09a90aa62ad36cd51f79f6632b87a3f11a" Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.970661 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"a1dafbb96dd2c362b3d89bf70ffd0d060558c0b610e4aef0c1ee3dd9db5c8c18"} Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.971233 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"5ecd9f339974dd5287c12c5f78c590bf42cfddc724a24f013d5b2dec42ed1592"} Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.971255 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"f4e94e843d6b82d1e65d1dafc08a0605f60ceadf7020c9d11412979e5dc4a8b5"} Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.971268 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"414422fb0ffd7d3f2906fcf6cec6a1e767bbaaa901460386bf56f64c06398957"} Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.971283 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"531fb68bc7f4d0bcf155fbe50b957ba09763a6de047c76ecb875d164000aa11e"} Nov 22 03:03:59 crc kubenswrapper[4952]: I1122 03:03:59.971296 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"c7450384f1930a218d5dfffd1329ebec1946195694d81b64ce7eaa0ee7fa2464"} Nov 22 03:04:00 crc kubenswrapper[4952]: I1122 03:04:00.540392 4952 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="bef051cd-2285-4b6b-a16f-1154f4d1f5dd" path="/var/lib/kubelet/pods/bef051cd-2285-4b6b-a16f-1154f4d1f5dd/volumes" Nov 22 03:04:03 crc kubenswrapper[4952]: I1122 03:04:03.000854 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"d2988931644f65cc6a5cb7aae4f6d2484a720d60da6ee256d1828548dfc96414"} Nov 22 03:04:05 crc kubenswrapper[4952]: I1122 03:04:05.022348 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" event={"ID":"a7a81d42-b69b-452d-a8ee-eada127106e8","Type":"ContainerStarted","Data":"f0c460999a35c4c8df3304dea1c33f3f082bee9156c152ebdde6cd83b0f53f14"} Nov 22 03:04:05 crc kubenswrapper[4952]: I1122 03:04:05.023206 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:05 crc kubenswrapper[4952]: I1122 03:04:05.023226 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:05 crc kubenswrapper[4952]: I1122 03:04:05.056794 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" podStartSLOduration=7.056772616 podStartE2EDuration="7.056772616s" podCreationTimestamp="2025-11-22 03:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:04:05.050958001 +0000 UTC m=+609.356975274" watchObservedRunningTime="2025-11-22 03:04:05.056772616 +0000 UTC m=+609.362789889" Nov 22 03:04:05 crc kubenswrapper[4952]: I1122 03:04:05.069980 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:06 crc kubenswrapper[4952]: I1122 03:04:06.029518 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:06 crc kubenswrapper[4952]: I1122 03:04:06.077285 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:12 crc kubenswrapper[4952]: I1122 03:04:12.531918 4952 scope.go:117] "RemoveContainer" containerID="cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31" Nov 22 03:04:12 crc kubenswrapper[4952]: E1122 03:04:12.532881 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-j9kg2_openshift-multus(ccedfe81-43b3-4af7-88c7-9953b33e7d13)\"" pod="openshift-multus/multus-j9kg2" podUID="ccedfe81-43b3-4af7-88c7-9953b33e7d13" Nov 22 03:04:26 crc kubenswrapper[4952]: I1122 03:04:26.537994 4952 scope.go:117] "RemoveContainer" containerID="cebc1e28cbfdd4056d2727f1ad546c42aae332550aa30af0ab61c05720129d31" Nov 22 03:04:27 crc kubenswrapper[4952]: I1122 03:04:27.192111 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9kg2_ccedfe81-43b3-4af7-88c7-9953b33e7d13/kube-multus/2.log" Nov 22 03:04:27 crc kubenswrapper[4952]: I1122 03:04:27.192943 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9kg2" 
event={"ID":"ccedfe81-43b3-4af7-88c7-9953b33e7d13","Type":"ContainerStarted","Data":"309c492ed536b5baa41fcb236a162b447ae4cacb8df38695f50b85b5efeca21d"} Nov 22 03:04:28 crc kubenswrapper[4952]: I1122 03:04:28.465433 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6w66c" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.545444 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2"] Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.548868 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.551477 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.561034 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2"] Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.721419 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.721528 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kb5m\" (UniqueName: \"kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.721622 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.822511 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.822589 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kb5m\" (UniqueName: \"kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" 
Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.822652 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.823196 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.824015 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.850936 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kb5m\" (UniqueName: \"kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:40 crc kubenswrapper[4952]: I1122 03:04:40.901605 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:41 crc kubenswrapper[4952]: I1122 03:04:41.212414 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2"] Nov 22 03:04:41 crc kubenswrapper[4952]: I1122 03:04:41.288579 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" event={"ID":"905bd3d3-e252-45e1-8d2d-287b04287e7d","Type":"ContainerStarted","Data":"93ee9c183f7f24412f2a75e3f0db025e7def8e4f0e27d41ed2fecbd5a8bf85c9"} Nov 22 03:04:42 crc kubenswrapper[4952]: I1122 03:04:42.300196 4952 generic.go:334] "Generic (PLEG): container finished" podID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerID="06fcf32c14e448c7609bc3b3eb7e76dddc5f978245f34c39d27efc5ac890aac9" exitCode=0 Nov 22 03:04:42 crc kubenswrapper[4952]: I1122 03:04:42.300294 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" event={"ID":"905bd3d3-e252-45e1-8d2d-287b04287e7d","Type":"ContainerDied","Data":"06fcf32c14e448c7609bc3b3eb7e76dddc5f978245f34c39d27efc5ac890aac9"} Nov 22 03:04:44 crc kubenswrapper[4952]: I1122 03:04:44.319981 4952 generic.go:334] "Generic (PLEG): container finished" podID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerID="0523f4a498db6e1c02c8d7b0b5aed09a883827694d26895d3dce601887c57519" exitCode=0 Nov 22 03:04:44 crc kubenswrapper[4952]: I1122 03:04:44.320129 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" event={"ID":"905bd3d3-e252-45e1-8d2d-287b04287e7d","Type":"ContainerDied","Data":"0523f4a498db6e1c02c8d7b0b5aed09a883827694d26895d3dce601887c57519"} Nov 22 03:04:45 crc kubenswrapper[4952]: I1122 03:04:45.327680 4952 generic.go:334] "Generic (PLEG): container finished" podID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerID="c4058018a552f485b7c53706a5f113282e1388c498ad75569a3fa94a23ffac5e" exitCode=0 Nov 22 03:04:45 crc kubenswrapper[4952]: I1122 03:04:45.327774 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" event={"ID":"905bd3d3-e252-45e1-8d2d-287b04287e7d","Type":"ContainerDied","Data":"c4058018a552f485b7c53706a5f113282e1388c498ad75569a3fa94a23ffac5e"} Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.674041 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.815002 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kb5m\" (UniqueName: \"kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m\") pod \"905bd3d3-e252-45e1-8d2d-287b04287e7d\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.815223 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util\") pod \"905bd3d3-e252-45e1-8d2d-287b04287e7d\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.815267 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle\") pod \"905bd3d3-e252-45e1-8d2d-287b04287e7d\" (UID: \"905bd3d3-e252-45e1-8d2d-287b04287e7d\") " Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.816083 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle" (OuterVolumeSpecName: "bundle") pod "905bd3d3-e252-45e1-8d2d-287b04287e7d" (UID: "905bd3d3-e252-45e1-8d2d-287b04287e7d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.821429 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m" (OuterVolumeSpecName: "kube-api-access-5kb5m") pod "905bd3d3-e252-45e1-8d2d-287b04287e7d" (UID: "905bd3d3-e252-45e1-8d2d-287b04287e7d"). InnerVolumeSpecName "kube-api-access-5kb5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.916441 4952 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:04:46 crc kubenswrapper[4952]: I1122 03:04:46.916476 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kb5m\" (UniqueName: \"kubernetes.io/projected/905bd3d3-e252-45e1-8d2d-287b04287e7d-kube-api-access-5kb5m\") on node \"crc\" DevicePath \"\"" Nov 22 03:04:47 crc kubenswrapper[4952]: I1122 03:04:47.257227 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util" (OuterVolumeSpecName: "util") pod "905bd3d3-e252-45e1-8d2d-287b04287e7d" (UID: "905bd3d3-e252-45e1-8d2d-287b04287e7d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:04:47 crc kubenswrapper[4952]: I1122 03:04:47.321388 4952 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/905bd3d3-e252-45e1-8d2d-287b04287e7d-util\") on node \"crc\" DevicePath \"\"" Nov 22 03:04:47 crc kubenswrapper[4952]: I1122 03:04:47.349027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" event={"ID":"905bd3d3-e252-45e1-8d2d-287b04287e7d","Type":"ContainerDied","Data":"93ee9c183f7f24412f2a75e3f0db025e7def8e4f0e27d41ed2fecbd5a8bf85c9"} Nov 22 03:04:47 crc kubenswrapper[4952]: I1122 03:04:47.349101 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ee9c183f7f24412f2a75e3f0db025e7def8e4f0e27d41ed2fecbd5a8bf85c9" Nov 22 03:04:47 crc kubenswrapper[4952]: I1122 03:04:47.349172 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.225148 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-n49rs"] Nov 22 03:04:49 crc kubenswrapper[4952]: E1122 03:04:49.225405 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="extract" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.225418 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="extract" Nov 22 03:04:49 crc kubenswrapper[4952]: E1122 03:04:49.225439 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="util" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.225444 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="util" Nov 22 03:04:49 crc kubenswrapper[4952]: E1122 03:04:49.225453 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="pull" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.225459 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="pull" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.225570 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="905bd3d3-e252-45e1-8d2d-287b04287e7d" containerName="extract" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.226046 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.228525 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.228525 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-lj9lp" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.231928 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.247795 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-n49rs"] Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.349394 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrzm\" (UniqueName: \"kubernetes.io/projected/fcbe36fe-f202-4a96-9016-b1f879fb5384-kube-api-access-cdrzm\") pod \"nmstate-operator-557fdffb88-n49rs\" (UID: \"fcbe36fe-f202-4a96-9016-b1f879fb5384\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.451036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrzm\" (UniqueName: \"kubernetes.io/projected/fcbe36fe-f202-4a96-9016-b1f879fb5384-kube-api-access-cdrzm\") pod \"nmstate-operator-557fdffb88-n49rs\" (UID: \"fcbe36fe-f202-4a96-9016-b1f879fb5384\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.476601 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrzm\" (UniqueName: \"kubernetes.io/projected/fcbe36fe-f202-4a96-9016-b1f879fb5384-kube-api-access-cdrzm\") pod \"nmstate-operator-557fdffb88-n49rs\" (UID: \"fcbe36fe-f202-4a96-9016-b1f879fb5384\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.543349 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" Nov 22 03:04:49 crc kubenswrapper[4952]: I1122 03:04:49.767585 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-n49rs"] Nov 22 03:04:49 crc kubenswrapper[4952]: W1122 03:04:49.773235 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbe36fe_f202_4a96_9016_b1f879fb5384.slice/crio-0b0a967a54a4fbed95fef6f7a8c43d56ee78fded52d1d6c789cfa5df83e3e1da WatchSource:0}: Error finding container 0b0a967a54a4fbed95fef6f7a8c43d56ee78fded52d1d6c789cfa5df83e3e1da: Status 404 returned error can't find the container with id 0b0a967a54a4fbed95fef6f7a8c43d56ee78fded52d1d6c789cfa5df83e3e1da Nov 22 03:04:50 crc kubenswrapper[4952]: I1122 03:04:50.367636 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" event={"ID":"fcbe36fe-f202-4a96-9016-b1f879fb5384","Type":"ContainerStarted","Data":"0b0a967a54a4fbed95fef6f7a8c43d56ee78fded52d1d6c789cfa5df83e3e1da"} Nov 22 03:04:52 crc kubenswrapper[4952]: I1122 03:04:52.382177 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" event={"ID":"fcbe36fe-f202-4a96-9016-b1f879fb5384","Type":"ContainerStarted","Data":"584f2c03743632f986274c8d41204ae171ac25791d13efcfb3c89e44d3b11fe1"} Nov 22 03:04:52 crc kubenswrapper[4952]: I1122 03:04:52.404331 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-n49rs" podStartSLOduration=1.273683975 podStartE2EDuration="3.404305464s" podCreationTimestamp="2025-11-22 03:04:49 +0000 UTC" firstStartedPulling="2025-11-22 03:04:49.777738384 +0000 UTC m=+654.083755657" lastFinishedPulling="2025-11-22 03:04:51.908359873 +0000 UTC m=+656.214377146" observedRunningTime="2025-11-22 03:04:52.401574692 +0000 UTC m=+656.707592005" watchObservedRunningTime="2025-11-22 03:04:52.404305464 +0000 UTC m=+656.710322747" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.482399 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.484054 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.489209 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.489596 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ckggj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.528754 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-h7ck5"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.529793 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.534750 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.535804 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.540850 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.564277 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611314 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-dbus-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611379 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj89\" (UniqueName: \"kubernetes.io/projected/d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504-kube-api-access-2pj89\") pod \"nmstate-metrics-5dcf9c57c5-lr7gp\" (UID: \"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611506 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31032708-369f-4a6a-a5a8-99b4c72c38a1-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611559 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6s5\" (UniqueName: \"kubernetes.io/projected/31032708-369f-4a6a-a5a8-99b4c72c38a1-kube-api-access-8b6s5\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611582 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-nmstate-lock\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611648 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-ovs-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.611725 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g2dz\" (UniqueName: \"kubernetes.io/projected/dccd7a53-7367-4e5a-9c27-0e38d8dce463-kube-api-access-2g2dz\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.647490 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"] Nov 22 03:04:53 crc kubenswrapper[4952]: 
I1122 03:04:53.653496 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.655856 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.656611 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.657264 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5626w" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.665265 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713798 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-dbus-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713849 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj89\" (UniqueName: \"kubernetes.io/projected/d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504-kube-api-access-2pj89\") pod \"nmstate-metrics-5dcf9c57c5-lr7gp\" (UID: \"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713893 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31032708-369f-4a6a-a5a8-99b4c72c38a1-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713920 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6s5\" (UniqueName: \"kubernetes.io/projected/31032708-369f-4a6a-a5a8-99b4c72c38a1-kube-api-access-8b6s5\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713936 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-nmstate-lock\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713960 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-ovs-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.713991 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g2dz\" (UniqueName: \"kubernetes.io/projected/dccd7a53-7367-4e5a-9c27-0e38d8dce463-kube-api-access-2g2dz\") pod \"nmstate-handler-h7ck5\" (UID: 
\"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.714672 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-dbus-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.715791 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-nmstate-lock\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.715991 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dccd7a53-7367-4e5a-9c27-0e38d8dce463-ovs-socket\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.745807 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b6s5\" (UniqueName: \"kubernetes.io/projected/31032708-369f-4a6a-a5a8-99b4c72c38a1-kube-api-access-8b6s5\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.746373 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31032708-369f-4a6a-a5a8-99b4c72c38a1-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-pp2cj\" (UID: \"31032708-369f-4a6a-a5a8-99b4c72c38a1\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.746943 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g2dz\" (UniqueName: \"kubernetes.io/projected/dccd7a53-7367-4e5a-9c27-0e38d8dce463-kube-api-access-2g2dz\") pod \"nmstate-handler-h7ck5\" (UID: \"dccd7a53-7367-4e5a-9c27-0e38d8dce463\") " pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.747753 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj89\" (UniqueName: \"kubernetes.io/projected/d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504-kube-api-access-2pj89\") pod \"nmstate-metrics-5dcf9c57c5-lr7gp\" (UID: \"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.801264 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.815911 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrhc\" (UniqueName: \"kubernetes.io/projected/e03eb50e-826c-40bd-9f6b-856c064dd96f-kube-api-access-7mrhc\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.816218 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.816299 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e03eb50e-826c-40bd-9f6b-856c064dd96f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.853352 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h7ck5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.877881 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.887502 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-576877884-5jgh2"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.888533 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-576877884-5jgh2" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.902231 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576877884-5jgh2"] Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.921812 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrhc\" (UniqueName: \"kubernetes.io/projected/e03eb50e-826c-40bd-9f6b-856c064dd96f-kube-api-access-7mrhc\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.921919 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.921942 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e03eb50e-826c-40bd-9f6b-856c064dd96f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" Nov 22 03:04:53 crc kubenswrapper[4952]: E1122 03:04:53.922842 4952 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 22 03:04:53 crc kubenswrapper[4952]: E1122 03:04:53.922948 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert podName:e03eb50e-826c-40bd-9f6b-856c064dd96f nodeName:}" failed. No retries permitted until 2025-11-22 03:04:54.422921727 +0000 UTC m=+658.728939000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert") pod "nmstate-console-plugin-5874bd7bc5-gm8k5" (UID: "e03eb50e-826c-40bd-9f6b-856c064dd96f") : secret "plugin-serving-cert" not found
Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.923473 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e03eb50e-826c-40bd-9f6b-856c064dd96f-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"
Nov 22 03:04:53 crc kubenswrapper[4952]: I1122 03:04:53.971409 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrhc\" (UniqueName: \"kubernetes.io/projected/e03eb50e-826c-40bd-9f6b-856c064dd96f-kube-api-access-7mrhc\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025251 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025319 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-service-ca\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025345 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025367 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-trusted-ca-bundle\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025404 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bp5\" (UniqueName: \"kubernetes.io/projected/2fe1b8ff-1736-480d-923c-ad35e5a218e5-kube-api-access-c5bp5\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025731 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-oauth-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.025785 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-oauth-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.126771 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp"]
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127292 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-oauth-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127344 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-oauth-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127364 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127395 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-service-ca\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127416 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127433 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-trusted-ca-bundle\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.127470 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bp5\" (UniqueName: \"kubernetes.io/projected/2fe1b8ff-1736-480d-923c-ad35e5a218e5-kube-api-access-c5bp5\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.129277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.129284 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-service-ca\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.129559 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-oauth-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.130401 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1b8ff-1736-480d-923c-ad35e5a218e5-trusted-ca-bundle\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: W1122 03:04:54.132810 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55bf0cd_e2d6_4eb3_94a9_3689ee1e2504.slice/crio-04f9b3791a04c6b862b8313709ce6216db716bd43e28022073279f727b339a06 WatchSource:0}: Error finding container 04f9b3791a04c6b862b8313709ce6216db716bd43e28022073279f727b339a06: Status 404 returned error can't find the container with id 04f9b3791a04c6b862b8313709ce6216db716bd43e28022073279f727b339a06
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.133089 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-oauth-config\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.133367 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe1b8ff-1736-480d-923c-ad35e5a218e5-console-serving-cert\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.146749 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bp5\" (UniqueName: \"kubernetes.io/projected/2fe1b8ff-1736-480d-923c-ad35e5a218e5-kube-api-access-c5bp5\") pod \"console-576877884-5jgh2\" (UID: \"2fe1b8ff-1736-480d-923c-ad35e5a218e5\") " pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.187489 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj"]
Nov 22 03:04:54 crc kubenswrapper[4952]: W1122 03:04:54.190660 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31032708_369f_4a6a_a5a8_99b4c72c38a1.slice/crio-e18b682d968e40f3bc646c13667ec4c2f4dc2d09b787e2badbb85abc49aa03b4 WatchSource:0}: Error finding container e18b682d968e40f3bc646c13667ec4c2f4dc2d09b787e2badbb85abc49aa03b4: Status 404 returned error can't find the container with id e18b682d968e40f3bc646c13667ec4c2f4dc2d09b787e2badbb85abc49aa03b4
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.211775 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.394581 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" event={"ID":"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504","Type":"ContainerStarted","Data":"04f9b3791a04c6b862b8313709ce6216db716bd43e28022073279f727b339a06"}
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.395676 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" event={"ID":"31032708-369f-4a6a-a5a8-99b4c72c38a1","Type":"ContainerStarted","Data":"e18b682d968e40f3bc646c13667ec4c2f4dc2d09b787e2badbb85abc49aa03b4"}
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.396687 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h7ck5" event={"ID":"dccd7a53-7367-4e5a-9c27-0e38d8dce463","Type":"ContainerStarted","Data":"2ece22d9478018a427b3d5b7c298bf1ebd959fbad3d32f92b2a9a1aa8cddc20f"}
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.418674 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576877884-5jgh2"]
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.430092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.435130 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e03eb50e-826c-40bd-9f6b-856c064dd96f-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-gm8k5\" (UID: \"e03eb50e-826c-40bd-9f6b-856c064dd96f\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.571880 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"
Nov 22 03:04:54 crc kubenswrapper[4952]: I1122 03:04:54.771985 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5"]
Nov 22 03:04:54 crc kubenswrapper[4952]: W1122 03:04:54.785354 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03eb50e_826c_40bd_9f6b_856c064dd96f.slice/crio-33901af951e8a2e61f764648067097fccd6f98cee8e45c566c50ebdfbcaa7b8c WatchSource:0}: Error finding container 33901af951e8a2e61f764648067097fccd6f98cee8e45c566c50ebdfbcaa7b8c: Status 404 returned error can't find the container with id 33901af951e8a2e61f764648067097fccd6f98cee8e45c566c50ebdfbcaa7b8c
Nov 22 03:04:55 crc kubenswrapper[4952]: I1122 03:04:55.404903 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" event={"ID":"e03eb50e-826c-40bd-9f6b-856c064dd96f","Type":"ContainerStarted","Data":"33901af951e8a2e61f764648067097fccd6f98cee8e45c566c50ebdfbcaa7b8c"}
Nov 22 03:04:55 crc kubenswrapper[4952]: I1122 03:04:55.407220 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576877884-5jgh2" event={"ID":"2fe1b8ff-1736-480d-923c-ad35e5a218e5","Type":"ContainerStarted","Data":"6da32c383af0a0b02adff31aa47fd7050ca509c6d47bb6c1a452308f7cf9e62e"}
Nov 22 03:04:55 crc kubenswrapper[4952]: I1122 03:04:55.407249 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576877884-5jgh2" event={"ID":"2fe1b8ff-1736-480d-923c-ad35e5a218e5","Type":"ContainerStarted","Data":"58b8af011fd0ae0bb7aaf62c67a48591fd904418b2526c49e63e3a924bc95279"}
Nov 22 03:04:56 crc kubenswrapper[4952]: I1122 03:04:56.554171 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-576877884-5jgh2" podStartSLOduration=3.554150021 podStartE2EDuration="3.554150021s" podCreationTimestamp="2025-11-22 03:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:04:55.427002073 +0000 UTC m=+659.733019356" watchObservedRunningTime="2025-11-22 03:04:56.554150021 +0000 UTC m=+660.860167294"
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.428312 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" event={"ID":"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504","Type":"ContainerStarted","Data":"7042b20f4e77615239624a5754dc63d9cbd81f7cebf9e5de5ba7e839327ce105"}
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.430083 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" event={"ID":"31032708-369f-4a6a-a5a8-99b4c72c38a1","Type":"ContainerStarted","Data":"d528241836725ad6cc2f2410085ae4d45cb8731029b6423411224c553c60d707"}
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.430245 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj"
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.432701 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h7ck5" event={"ID":"dccd7a53-7367-4e5a-9c27-0e38d8dce463","Type":"ContainerStarted","Data":"d6c6dde93dd6731e2606ac4a48c7a5b53a4734e64e98301d149d5023323a4fb2"}
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.432892 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-h7ck5"
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.435399 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" event={"ID":"e03eb50e-826c-40bd-9f6b-856c064dd96f","Type":"ContainerStarted","Data":"6a69401e0bb0b265ada3db7409aec00826a38672b5d26d426907e777239a335f"}
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.457726 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj" podStartSLOduration=2.141444863 podStartE2EDuration="5.457699964s" podCreationTimestamp="2025-11-22 03:04:53 +0000 UTC" firstStartedPulling="2025-11-22 03:04:54.193060327 +0000 UTC m=+658.499077600" lastFinishedPulling="2025-11-22 03:04:57.509315438 +0000 UTC m=+661.815332701" observedRunningTime="2025-11-22 03:04:58.448076259 +0000 UTC m=+662.754093532" watchObservedRunningTime="2025-11-22 03:04:58.457699964 +0000 UTC m=+662.763717237"
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.478038 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-h7ck5" podStartSLOduration=2.069278315 podStartE2EDuration="5.478010254s" podCreationTimestamp="2025-11-22 03:04:53 +0000 UTC" firstStartedPulling="2025-11-22 03:04:53.950621843 +0000 UTC m=+658.256639116" lastFinishedPulling="2025-11-22 03:04:57.359353762 +0000 UTC m=+661.665371055" observedRunningTime="2025-11-22 03:04:58.471717997 +0000 UTC m=+662.777735290" watchObservedRunningTime="2025-11-22 03:04:58.478010254 +0000 UTC m=+662.784027537"
Nov 22 03:04:58 crc kubenswrapper[4952]: I1122 03:04:58.493766 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-gm8k5" podStartSLOduration=2.802062971 podStartE2EDuration="5.493743032s" podCreationTimestamp="2025-11-22 03:04:53 +0000 UTC" firstStartedPulling="2025-11-22 03:04:54.788086612 +0000 UTC m=+659.094103885" lastFinishedPulling="2025-11-22 03:04:57.479766673 +0000 UTC m=+661.785783946" observedRunningTime="2025-11-22 03:04:58.491689708 +0000 UTC m=+662.797706981" watchObservedRunningTime="2025-11-22 03:04:58.493743032 +0000 UTC m=+662.799760305"
Nov 22 03:05:00 crc kubenswrapper[4952]: I1122 03:05:00.454142 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" event={"ID":"d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504","Type":"ContainerStarted","Data":"683a513601d238309843e80179bb4a0274045bdbc9fa1a029a42d840678e54cb"}
Nov 22 03:05:00 crc kubenswrapper[4952]: I1122 03:05:00.474959 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-lr7gp" podStartSLOduration=1.756721258 podStartE2EDuration="7.47493275s" podCreationTimestamp="2025-11-22 03:04:53 +0000 UTC" firstStartedPulling="2025-11-22 03:04:54.136737029 +0000 UTC m=+658.442754302" lastFinishedPulling="2025-11-22 03:04:59.854948521 +0000 UTC m=+664.160965794" observedRunningTime="2025-11-22 03:05:00.474058027 +0000 UTC m=+664.780075330" watchObservedRunningTime="2025-11-22 03:05:00.47493275 +0000 UTC m=+664.780950023"
Nov 22 03:05:03 crc kubenswrapper[4952]: I1122 03:05:03.880321 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-h7ck5"
Nov 22 03:05:04 crc kubenswrapper[4952]: I1122 03:05:04.212743 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:05:04 crc kubenswrapper[4952]: I1122 03:05:04.212870 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:05:04 crc kubenswrapper[4952]: I1122 03:05:04.223071 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:05:04 crc kubenswrapper[4952]: I1122 03:05:04.498747 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-576877884-5jgh2"
Nov 22 03:05:04 crc kubenswrapper[4952]: I1122 03:05:04.586810 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-89rxq"]
Nov 22 03:05:13 crc kubenswrapper[4952]: I1122 03:05:13.889506 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-pp2cj"
Nov 22 03:05:29 crc kubenswrapper[4952]: I1122 03:05:29.670871 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-89rxq" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerName="console" containerID="cri-o://cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b" gracePeriod=15
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.093429 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-89rxq_aae47c6e-1d61-40ec-851a-c3e5a6242dcc/console/0.log"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.093901 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.186943 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.186998 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.187037 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.187087 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.187139 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.187186 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.187213 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ztvq\" (UniqueName: \"kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq\") pod \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\" (UID: \"aae47c6e-1d61-40ec-851a-c3e5a6242dcc\") "
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.188038 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.188256 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.188401 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca" (OuterVolumeSpecName: "service-ca") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.189137 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config" (OuterVolumeSpecName: "console-config") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.194422 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.204399 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq" (OuterVolumeSpecName: "kube-api-access-5ztvq") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "kube-api-access-5ztvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.210072 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aae47c6e-1d61-40ec-851a-c3e5a6242dcc" (UID: "aae47c6e-1d61-40ec-851a-c3e5a6242dcc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.288990 4952 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289038 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ztvq\" (UniqueName: \"kubernetes.io/projected/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-kube-api-access-5ztvq\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289052 4952 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289062 4952 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289070 4952 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289079 4952 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-config\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.289088 4952 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aae47c6e-1d61-40ec-851a-c3e5a6242dcc-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704226 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-89rxq_aae47c6e-1d61-40ec-851a-c3e5a6242dcc/console/0.log"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704285 4952 generic.go:334] "Generic (PLEG): container finished" podID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerID="cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b" exitCode=2
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704330 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89rxq" event={"ID":"aae47c6e-1d61-40ec-851a-c3e5a6242dcc","Type":"ContainerDied","Data":"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"}
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704372 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89rxq" event={"ID":"aae47c6e-1d61-40ec-851a-c3e5a6242dcc","Type":"ContainerDied","Data":"a7c29d4b60a5a64f7b3449f0a707c1ae4c4a7216abb589f3d40baa8855edcc77"}
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704393 4952 scope.go:117] "RemoveContainer" containerID="cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.704409 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89rxq"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.723293 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-89rxq"]
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.727389 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-89rxq"]
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.733565 4952 scope.go:117] "RemoveContainer" containerID="cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"
Nov 22 03:05:30 crc kubenswrapper[4952]: E1122 03:05:30.734868 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b\": container with ID starting with cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b not found: ID does not exist" containerID="cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.734948 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b"} err="failed to get container status \"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b\": rpc error: code = NotFound desc = could not find container \"cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b\": container with ID starting with cef0ad02145738f10f5b670bc68c23af29352bef69e6a23a56c736032313fe9b not found: ID does not exist"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.819532 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"]
Nov 22 03:05:30 crc kubenswrapper[4952]: E1122 03:05:30.819891 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerName="console"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.819907 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerName="console"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.820011 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" containerName="console"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.820905 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.823644 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.830066 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"]
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.896903 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw227\" (UniqueName: \"kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.896960 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.897018 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.998964 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.999024 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw227\" (UniqueName: \"kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.999055 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.999499 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:30 crc kubenswrapper[4952]: I1122 03:05:30.999661 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.017240 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw227\" (UniqueName: \"kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.143350 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.337223 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"]
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.712582 4952 generic.go:334] "Generic (PLEG): container finished" podID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerID="f2ea340e7be975055c65a888fe1ae4ee0de7d0ca72bd9a0738654d93bb25c7c2" exitCode=0
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.712712 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2" event={"ID":"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2","Type":"ContainerDied","Data":"f2ea340e7be975055c65a888fe1ae4ee0de7d0ca72bd9a0738654d93bb25c7c2"}
Nov 22 03:05:31 crc kubenswrapper[4952]: I1122 03:05:31.713046 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2" event={"ID":"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2","Type":"ContainerStarted","Data":"d6e0c4d7daf2446417556a7aad4b3628f1b4a06e90c5ed3af8c153ca12e583d8"}
Nov 22 03:05:32 crc kubenswrapper[4952]: I1122 03:05:32.544529 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae47c6e-1d61-40ec-851a-c3e5a6242dcc" path="/var/lib/kubelet/pods/aae47c6e-1d61-40ec-851a-c3e5a6242dcc/volumes"
Nov 22 03:05:33 crc kubenswrapper[4952]: I1122 03:05:33.733053 4952 generic.go:334] "Generic (PLEG): container finished" podID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerID="0401cdc548b43e4fb6bb1f872cb20cc977b80487f8366b9c15916d166e91ba63" exitCode=0
Nov 22 03:05:33 crc kubenswrapper[4952]: I1122 03:05:33.733137 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2" event={"ID":"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2","Type":"ContainerDied","Data":"0401cdc548b43e4fb6bb1f872cb20cc977b80487f8366b9c15916d166e91ba63"}
Nov 22 03:05:34 crc kubenswrapper[4952]: I1122 03:05:34.744035 4952 generic.go:334] "Generic (PLEG): container finished" podID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerID="c31760593646e7b853732ce680c86f2489ff172158fe5a737edba9efc2afcd91" exitCode=0
Nov 22 03:05:34 crc kubenswrapper[4952]: I1122 03:05:34.744114 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2" event={"ID":"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2","Type":"ContainerDied","Data":"c31760593646e7b853732ce680c86f2489ff172158fe5a737edba9efc2afcd91"}
Nov 22 03:05:35 crc kubenswrapper[4952]: I1122 03:05:35.997004 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.175596 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle\") pod \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") "
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.175721 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw227\" (UniqueName: \"kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227\") pod \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") "
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.175764 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util\") pod \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\" (UID: \"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2\") "
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.176642 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle" (OuterVolumeSpecName: "bundle") pod "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" (UID: "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.181687 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227" (OuterVolumeSpecName: "kube-api-access-vw227") pod "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" (UID: "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2"). InnerVolumeSpecName "kube-api-access-vw227". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.191349 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util" (OuterVolumeSpecName: "util") pod "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" (UID: "59fd7e52-0ba7-42f3-b749-493cf9c5b8d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.277848 4952 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.277889 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw227\" (UniqueName: \"kubernetes.io/projected/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-kube-api-access-vw227\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.277902 4952 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59fd7e52-0ba7-42f3-b749-493cf9c5b8d2-util\") on node \"crc\" DevicePath \"\""
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.776704 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2" event={"ID":"59fd7e52-0ba7-42f3-b749-493cf9c5b8d2","Type":"ContainerDied","Data":"d6e0c4d7daf2446417556a7aad4b3628f1b4a06e90c5ed3af8c153ca12e583d8"}
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.776786 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e0c4d7daf2446417556a7aad4b3628f1b4a06e90c5ed3af8c153ca12e583d8"
Nov 22 03:05:36 crc kubenswrapper[4952]: I1122 03:05:36.776808 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.278152 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"]
Nov 22 03:05:45 crc kubenswrapper[4952]: E1122 03:05:45.279078 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="pull"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.279093 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="pull"
Nov 22 03:05:45 crc kubenswrapper[4952]: E1122 03:05:45.279106 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="extract"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.279114 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="extract"
Nov 22 03:05:45 crc kubenswrapper[4952]: E1122 03:05:45.279125 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="util"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.279133 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="util"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.279265 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="59fd7e52-0ba7-42f3-b749-493cf9c5b8d2" containerName="extract"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.279792 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.282280 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.283053 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.283083 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bm7gl"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.283107 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.283189 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.293065 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"]
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.331455 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-webhook-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.331659 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-apiservice-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.331709 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzr4t\" (UniqueName: \"kubernetes.io/projected/96df557a-4a41-48dc-bbea-961acc5fd4db-kube-api-access-kzr4t\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.433508 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-apiservice-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.433653 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzr4t\" (UniqueName: \"kubernetes.io/projected/96df557a-4a41-48dc-bbea-961acc5fd4db-kube-api-access-kzr4t\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.433718 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-webhook-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.448334 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-webhook-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.465986 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96df557a-4a41-48dc-bbea-961acc5fd4db-apiservice-cert\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.494940 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzr4t\" (UniqueName: \"kubernetes.io/projected/96df557a-4a41-48dc-bbea-961acc5fd4db-kube-api-access-kzr4t\") pod \"metallb-operator-controller-manager-6cdd766b96-rgt84\" (UID: \"96df557a-4a41-48dc-bbea-961acc5fd4db\") " pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.602630 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.721090 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"]
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.738108 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.754697 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"]
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.759563 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-h88mn"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.759786 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.759829 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.842054 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-webhook-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.842151 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grb9n\" (UniqueName: \"kubernetes.io/projected/3bd69787-c3ae-450c-9854-1b7b9b54b379-kube-api-access-grb9n\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.842193 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-apiservice-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.919091 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"]
Nov 22 03:05:45 crc kubenswrapper[4952]: W1122 03:05:45.934346 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96df557a_4a41_48dc_bbea_961acc5fd4db.slice/crio-3a3244e2f553797a967bbd6d0657347012e8742588306d1c8eeb814241d7ae20 WatchSource:0}: Error finding container 3a3244e2f553797a967bbd6d0657347012e8742588306d1c8eeb814241d7ae20: Status 404 returned error can't find the container with id 3a3244e2f553797a967bbd6d0657347012e8742588306d1c8eeb814241d7ae20
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.943664 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grb9n\" (UniqueName: \"kubernetes.io/projected/3bd69787-c3ae-450c-9854-1b7b9b54b379-kube-api-access-grb9n\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.943739 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-apiservice-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.943830 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-webhook-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.949801 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-webhook-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.950592 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bd69787-c3ae-450c-9854-1b7b9b54b379-apiservice-cert\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:45 crc kubenswrapper[4952]: I1122 03:05:45.968159 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grb9n\" (UniqueName: \"kubernetes.io/projected/3bd69787-c3ae-450c-9854-1b7b9b54b379-kube-api-access-grb9n\") pod \"metallb-operator-webhook-server-c64b8d58d-nfvjm\" (UID: \"3bd69787-c3ae-450c-9854-1b7b9b54b379\") " pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:46 crc kubenswrapper[4952]: I1122 03:05:46.087000 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:46 crc kubenswrapper[4952]: I1122 03:05:46.562275 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"]
Nov 22 03:05:46 crc kubenswrapper[4952]: I1122 03:05:46.849134 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm" event={"ID":"3bd69787-c3ae-450c-9854-1b7b9b54b379","Type":"ContainerStarted","Data":"6dca6586c20b4bacb21c89b13d724b3d3241bc8e306688d0d15a6b171fc4c60f"}
Nov 22 03:05:46 crc kubenswrapper[4952]: I1122 03:05:46.850872 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84" event={"ID":"96df557a-4a41-48dc-bbea-961acc5fd4db","Type":"ContainerStarted","Data":"3a3244e2f553797a967bbd6d0657347012e8742588306d1c8eeb814241d7ae20"}
Nov 22 03:05:49 crc kubenswrapper[4952]: I1122 03:05:49.874842 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84" event={"ID":"96df557a-4a41-48dc-bbea-961acc5fd4db","Type":"ContainerStarted","Data":"68f083412db858ad83df6d5e11725241f1aa223f3d034ba8c8045abf582ca77b"}
Nov 22 03:05:49 crc kubenswrapper[4952]: I1122 03:05:49.876730 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:05:49 crc kubenswrapper[4952]: I1122 03:05:49.911061 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84" podStartSLOduration=1.361554108 podStartE2EDuration="4.911031708s" podCreationTimestamp="2025-11-22 03:05:45 +0000 UTC" firstStartedPulling="2025-11-22 03:05:45.937104787 +0000 UTC m=+710.243122060" lastFinishedPulling="2025-11-22 03:05:49.486582387 +0000 UTC m=+713.792599660" observedRunningTime="2025-11-22 03:05:49.8990456 +0000 UTC m=+714.205062893" watchObservedRunningTime="2025-11-22 03:05:49.911031708 +0000 UTC m=+714.217049001"
Nov 22 03:05:51 crc kubenswrapper[4952]: I1122 03:05:51.890927 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm" event={"ID":"3bd69787-c3ae-450c-9854-1b7b9b54b379","Type":"ContainerStarted","Data":"b2c98e354be6b4dd1d7716b3ef81b5227db1e976ad176778f559f3d7483704bc"}
Nov 22 03:05:51 crc kubenswrapper[4952]: I1122 03:05:51.917033 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm" podStartSLOduration=1.961937163 podStartE2EDuration="6.917007096s" podCreationTimestamp="2025-11-22 03:05:45 +0000 UTC" firstStartedPulling="2025-11-22 03:05:46.579313937 +0000 UTC m=+710.885331210" lastFinishedPulling="2025-11-22 03:05:51.53438387 +0000 UTC m=+715.840401143" observedRunningTime="2025-11-22 03:05:51.914885479 +0000 UTC m=+716.220902792" watchObservedRunningTime="2025-11-22 03:05:51.917007096 +0000 UTC m=+716.223024369"
Nov 22 03:05:52 crc kubenswrapper[4952]: I1122 03:05:52.899593 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:05:58 crc kubenswrapper[4952]: I1122 03:05:58.342480 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:05:58 crc kubenswrapper[4952]: I1122 03:05:58.343100 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:06:06 crc kubenswrapper[4952]: I1122 03:06:06.095392 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c64b8d58d-nfvjm"
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.167766 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"]
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.168956 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" containerID="cri-o://f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e" gracePeriod=30
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.265513 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"]
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.265861 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerName="route-controller-manager" containerID="cri-o://541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89" gracePeriod=30
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.607531 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cdd766b96-rgt84"
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.687451 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5"
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.744808 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888249 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config\") pod \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888311 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert\") pod \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888342 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca\") pod \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888376 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config\") pod \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888414 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcf88\" (UniqueName: \"kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88\") pod \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888442 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles\") pod \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\" (UID: \"63bdcada-d1ad-45eb-b290-42b2b8dd8257\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888484 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca\") pod \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888538 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-864dz\" (UniqueName: \"kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz\") pod \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.888582 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert\") pod \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\" (UID: \"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276\") "
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.889140 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca" (OuterVolumeSpecName: "client-ca") pod "63bdcada-d1ad-45eb-b290-42b2b8dd8257" (UID: "63bdcada-d1ad-45eb-b290-42b2b8dd8257"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.889586 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" (UID: "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.889611 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config" (OuterVolumeSpecName: "config") pod "63bdcada-d1ad-45eb-b290-42b2b8dd8257" (UID: "63bdcada-d1ad-45eb-b290-42b2b8dd8257"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.889578 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "63bdcada-d1ad-45eb-b290-42b2b8dd8257" (UID: "63bdcada-d1ad-45eb-b290-42b2b8dd8257"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.889715 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config" (OuterVolumeSpecName: "config") pod "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" (UID: "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.894597 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63bdcada-d1ad-45eb-b290-42b2b8dd8257" (UID: "63bdcada-d1ad-45eb-b290-42b2b8dd8257"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.894849 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88" (OuterVolumeSpecName: "kube-api-access-fcf88") pod "63bdcada-d1ad-45eb-b290-42b2b8dd8257" (UID: "63bdcada-d1ad-45eb-b290-42b2b8dd8257"). InnerVolumeSpecName "kube-api-access-fcf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.894894 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" (UID: "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.894938 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz" (OuterVolumeSpecName: "kube-api-access-864dz") pod "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" (UID: "3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276").
InnerVolumeSpecName "kube-api-access-864dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990218 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990256 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bdcada-d1ad-45eb-b290-42b2b8dd8257-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990269 4952 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990278 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990288 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcf88\" (UniqueName: \"kubernetes.io/projected/63bdcada-d1ad-45eb-b290-42b2b8dd8257-kube-api-access-fcf88\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990299 4952 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63bdcada-d1ad-45eb-b290-42b2b8dd8257-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990308 4952 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990316 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-864dz\" (UniqueName: \"kubernetes.io/projected/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-kube-api-access-864dz\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:25 crc kubenswrapper[4952]: I1122 03:06:25.990323 4952 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.134936 4952 generic.go:334] "Generic (PLEG): container finished" podID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerID="f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e" exitCode=0 Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.135009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" event={"ID":"63bdcada-d1ad-45eb-b290-42b2b8dd8257","Type":"ContainerDied","Data":"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e"} Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.135045 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.135091 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6kn5" event={"ID":"63bdcada-d1ad-45eb-b290-42b2b8dd8257","Type":"ContainerDied","Data":"5cff258cd5e1536d199a9b38da4c7e36d6cb69dbfd7ad6447d6e7b2d3524c968"} Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.135119 4952 scope.go:117] "RemoveContainer" containerID="f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.137952 4952 generic.go:334] "Generic (PLEG): container finished" podID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerID="541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89" exitCode=0 Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.138006 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.138020 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" event={"ID":"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276","Type":"ContainerDied","Data":"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89"} Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.138105 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx" event={"ID":"3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276","Type":"ContainerDied","Data":"f7cf6d585b72e1d6431f744e6c731c1821898500693a8779619ec3a8460b7e21"} Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.159728 4952 scope.go:117] "RemoveContainer" containerID="f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e" Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.160268 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e\": container with ID starting with f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e not found: ID does not exist" containerID="f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.160327 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e"} err="failed to get container status \"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e\": rpc error: code = NotFound desc = could not find container \"f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e\": container with ID starting with f2edcfe1fb5697c939e663e602f25f42e12bda9d287671aa108b49f40818da8e not found: ID does not exist" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.160364 4952 scope.go:117] "RemoveContainer" containerID="541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.170668 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.191105 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-z6kn5"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.192617 4952 scope.go:117] "RemoveContainer" containerID="541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89" Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.196131 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89\": container with ID starting with 541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89 not found: ID does not exist" containerID="541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.196800 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89"} err="failed to get container status \"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89\": rpc error: code = NotFound desc = could not find container \"541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89\": container with ID starting with 541c6f268d044648601b84df32a9f3d02ec82d32d5a0dbe03cee3f5eaa228b89 not found: ID does not exist" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.201422 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.207260 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fgnfx"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.362576 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-676cf"] Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.362809 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerName="route-controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.362823 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerName="route-controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.362843 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.362849 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.362935 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" containerName="controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.362952 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" containerName="route-controller-manager" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.363349 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.365597 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.365829 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-glmml" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.376530 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-676cf"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.387521 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p7rpg"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.389607 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.392330 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.392345 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.495024 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-q9fcb"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496726 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-conf\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496806 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htnk5\" (UniqueName: \"kubernetes.io/projected/b1512c43-7cc0-4c7e-82f0-108811e38971-kube-api-access-htnk5\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496839 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmrx\" (UniqueName: \"kubernetes.io/projected/59bb616f-6078-47be-a7d0-16749039f128-kube-api-access-2jmrx\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496861 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-metrics\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496893 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59bb616f-6078-47be-a7d0-16749039f128-metrics-certs\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496920 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-reloader\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496941 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/59bb616f-6078-47be-a7d0-16749039f128-frr-startup\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496958 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-sockets\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.496982 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1512c43-7cc0-4c7e-82f0-108811e38971-cert\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.497461 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.502415 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-nxf9p"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.504308 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.504371 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.504705 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.504940 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nh9cd" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.505090 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.511306 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.519762 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-nxf9p"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.540411 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276" path="/var/lib/kubelet/pods/3b7b7c10-8dd0-4a6e-bcf0-4078a1a73276/volumes" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.541119 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bdcada-d1ad-45eb-b290-42b2b8dd8257" path="/var/lib/kubelet/pods/63bdcada-d1ad-45eb-b290-42b2b8dd8257/volumes" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598063 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metallb-excludel2\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598138 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1512c43-7cc0-4c7e-82f0-108811e38971-cert\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598175 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598228 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598265 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj92z\" (UniqueName: \"kubernetes.io/projected/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-kube-api-access-tj92z\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc 
kubenswrapper[4952]: I1122 03:06:26.598306 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-conf\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598345 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htnk5\" (UniqueName: \"kubernetes.io/projected/b1512c43-7cc0-4c7e-82f0-108811e38971-kube-api-access-htnk5\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598375 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmrx\" (UniqueName: \"kubernetes.io/projected/59bb616f-6078-47be-a7d0-16749039f128-kube-api-access-2jmrx\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598401 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-metrics\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598437 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59bb616f-6078-47be-a7d0-16749039f128-metrics-certs\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598489 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-reloader\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598515 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/59bb616f-6078-47be-a7d0-16749039f128-frr-startup\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.598565 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-sockets\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.599231 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-sockets\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.599796 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-reloader\") pod \"frr-k8s-p7rpg\" (UID: 
\"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.600677 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/59bb616f-6078-47be-a7d0-16749039f128-frr-startup\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.600908 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-metrics\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.601190 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/59bb616f-6078-47be-a7d0-16749039f128-frr-conf\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.604844 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59bb616f-6078-47be-a7d0-16749039f128-metrics-certs\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.620836 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1512c43-7cc0-4c7e-82f0-108811e38971-cert\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.621415 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmrx\" (UniqueName: \"kubernetes.io/projected/59bb616f-6078-47be-a7d0-16749039f128-kube-api-access-2jmrx\") pod \"frr-k8s-p7rpg\" (UID: \"59bb616f-6078-47be-a7d0-16749039f128\") " pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.624303 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htnk5\" (UniqueName: \"kubernetes.io/projected/b1512c43-7cc0-4c7e-82f0-108811e38971-kube-api-access-htnk5\") pod \"frr-k8s-webhook-server-6998585d5-676cf\" (UID: \"b1512c43-7cc0-4c7e-82f0-108811e38971\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.683467 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.699926 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700319 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-metrics-certs\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700347 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj92z\" (UniqueName: \"kubernetes.io/projected/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-kube-api-access-tj92z\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700388 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz5s\" (UniqueName: \"kubernetes.io/projected/97e744a4-812f-4662-bc3c-80777619b8a2-kube-api-access-mhz5s\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700464 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metallb-excludel2\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700484 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.700558 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-cert\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.700241 4952 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.700686 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist podName:7b13d75a-b01b-4879-9ab2-1c5ffb445c38 nodeName:}" failed. No retries permitted until 2025-11-22 03:06:27.200659695 +0000 UTC m=+751.506676968 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist") pod "speaker-q9fcb" (UID: "7b13d75a-b01b-4879-9ab2-1c5ffb445c38") : secret "metallb-memberlist" not found Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.701703 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metallb-excludel2\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.702085 4952 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 22 03:06:26 crc kubenswrapper[4952]: E1122 03:06:26.702133 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs podName:7b13d75a-b01b-4879-9ab2-1c5ffb445c38 nodeName:}" failed. No retries permitted until 2025-11-22 03:06:27.202120824 +0000 UTC m=+751.508138097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs") pod "speaker-q9fcb" (UID: "7b13d75a-b01b-4879-9ab2-1c5ffb445c38") : secret "speaker-certs-secret" not found Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.709795 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p7rpg" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.745145 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj92z\" (UniqueName: \"kubernetes.io/projected/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-kube-api-access-tj92z\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.815563 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-cert\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.815894 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-metrics-certs\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.816006 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz5s\" (UniqueName: \"kubernetes.io/projected/97e744a4-812f-4662-bc3c-80777619b8a2-kube-api-access-mhz5s\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.819334 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-metrics-certs\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 
03:06:26.824830 4952 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.833978 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e744a4-812f-4662-bc3c-80777619b8a2-cert\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.849794 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz5s\" (UniqueName: \"kubernetes.io/projected/97e744a4-812f-4662-bc3c-80777619b8a2-kube-api-access-mhz5s\") pod \"controller-6c7b4b5f48-nxf9p\" (UID: \"97e744a4-812f-4662-bc3c-80777619b8a2\") " pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.928204 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"] Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.929111 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.932300 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.932456 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.932843 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.933606 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.934188 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.950389 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 03:06:26 crc kubenswrapper[4952]: I1122 03:06:26.967938 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"] Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.024314 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zch\" (UniqueName: \"kubernetes.io/projected/96e2d508-983b-4806-9333-33389b25b876-kube-api-access-f8zch\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.024863 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e2d508-983b-4806-9333-33389b25b876-serving-cert\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.024938 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-client-ca\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.024969 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-config\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.125471 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-client-ca\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.125535 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-config\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.125631 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zch\" (UniqueName: \"kubernetes.io/projected/96e2d508-983b-4806-9333-33389b25b876-kube-api-access-f8zch\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.125669 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e2d508-983b-4806-9333-33389b25b876-serving-cert\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.126517 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-client-ca\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.126960 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e2d508-983b-4806-9333-33389b25b876-config\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" 
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.128795 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-nxf9p" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.131574 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96e2d508-983b-4806-9333-33389b25b876-serving-cert\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.146729 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zch\" (UniqueName: \"kubernetes.io/projected/96e2d508-983b-4806-9333-33389b25b876-kube-api-access-f8zch\") pod \"route-controller-manager-6c45bf5c76-pg4wx\" (UID: \"96e2d508-983b-4806-9333-33389b25b876\") " pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.160346 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"f5371a9cdfd36bff70ce0741a17c06da82e098ef33b77948a84aa358f39cfff8"} Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.196991 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"] Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.198138 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.200608 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.203617 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"] Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.205489 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.205804 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.206089 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.206276 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.206477 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.214097 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227162 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrx8\" (UniqueName: \"kubernetes.io/projected/c515daeb-7531-4fbe-8e98-fcdf029e167b-kube-api-access-tsrx8\") pod 
\"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227221 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-client-ca\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227261 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227293 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-proxy-ca-bundles\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227321 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227338 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c515daeb-7531-4fbe-8e98-fcdf029e167b-serving-cert\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.227376 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-config\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" Nov 22 03:06:27 crc kubenswrapper[4952]: E1122 03:06:27.228398 4952 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 03:06:27 crc kubenswrapper[4952]: E1122 03:06:27.228503 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist podName:7b13d75a-b01b-4879-9ab2-1c5ffb445c38 nodeName:}" failed. No retries permitted until 2025-11-22 03:06:28.228476446 +0000 UTC m=+752.534493729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist") pod "speaker-q9fcb" (UID: "7b13d75a-b01b-4879-9ab2-1c5ffb445c38") : secret "metallb-memberlist" not found
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.233951 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-metrics-certs\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.250051 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.328738 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrx8\" (UniqueName: \"kubernetes.io/projected/c515daeb-7531-4fbe-8e98-fcdf029e167b-kube-api-access-tsrx8\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.329078 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-client-ca\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.329110 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-proxy-ca-bundles\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.329149 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c515daeb-7531-4fbe-8e98-fcdf029e167b-serving-cert\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.329187 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-config\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.330656 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-config\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.330954 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-proxy-ca-bundles\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.332308 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c515daeb-7531-4fbe-8e98-fcdf029e167b-client-ca\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.333964 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c515daeb-7531-4fbe-8e98-fcdf029e167b-serving-cert\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.350388 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrx8\" (UniqueName: \"kubernetes.io/projected/c515daeb-7531-4fbe-8e98-fcdf029e167b-kube-api-access-tsrx8\") pod \"controller-manager-55b468cbf8-w6lh9\" (UID: \"c515daeb-7531-4fbe-8e98-fcdf029e167b\") " pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.407991 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-676cf"]
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.501332 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-nxf9p"]
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.528232 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.564692 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"]
Nov 22 03:06:27 crc kubenswrapper[4952]: W1122 03:06:27.571891 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e2d508_983b_4806_9333_33389b25b876.slice/crio-98833df482be4a0aa0330dd48098237edf8e6cf9d98ec1959efdd8184ec1f514 WatchSource:0}: Error finding container 98833df482be4a0aa0330dd48098237edf8e6cf9d98ec1959efdd8184ec1f514: Status 404 returned error can't find the container with id 98833df482be4a0aa0330dd48098237edf8e6cf9d98ec1959efdd8184ec1f514
Nov 22 03:06:27 crc kubenswrapper[4952]: I1122 03:06:27.766108 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"]
Nov 22 03:06:27 crc kubenswrapper[4952]: W1122 03:06:27.776744 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc515daeb_7531_4fbe_8e98_fcdf029e167b.slice/crio-7cdfd13c39ec55f647059454346c17f6e0485188e2baf70c5053a54617605e53 WatchSource:0}: Error finding container 7cdfd13c39ec55f647059454346c17f6e0485188e2baf70c5053a54617605e53: Status 404 returned error can't find the container with id 7cdfd13c39ec55f647059454346c17f6e0485188e2baf70c5053a54617605e53
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.170377 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" event={"ID":"96e2d508-983b-4806-9333-33389b25b876","Type":"ContainerStarted","Data":"80f1d73f165efb923e394876654769633087bd288b4d59a86fc9ca4c2fb94a47"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.170971 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" event={"ID":"96e2d508-983b-4806-9333-33389b25b876","Type":"ContainerStarted","Data":"98833df482be4a0aa0330dd48098237edf8e6cf9d98ec1959efdd8184ec1f514"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.171020 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.173318 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" event={"ID":"c515daeb-7531-4fbe-8e98-fcdf029e167b","Type":"ContainerStarted","Data":"c9fd9b4b662482fd3d0b91b9210590900f0980b2225b8333f7aab49cf5709b64"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.173478 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" event={"ID":"c515daeb-7531-4fbe-8e98-fcdf029e167b","Type":"ContainerStarted","Data":"7cdfd13c39ec55f647059454346c17f6e0485188e2baf70c5053a54617605e53"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.173598 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.175689 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-nxf9p" event={"ID":"97e744a4-812f-4662-bc3c-80777619b8a2","Type":"ContainerStarted","Data":"c9086362f64a5d8026d110fa1129506bbaac318a775b064b34b813a49a6fa12b"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.175760 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-nxf9p" event={"ID":"97e744a4-812f-4662-bc3c-80777619b8a2","Type":"ContainerStarted","Data":"f03fc6551f077566945536b3925cc4325876e68f7743bad4abc1486919a1eab0"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.175776 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-nxf9p" event={"ID":"97e744a4-812f-4662-bc3c-80777619b8a2","Type":"ContainerStarted","Data":"f81c371bdc6faa1534668089976b4f9563333eacb61b73d2706617eb5ebbff33"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.175812 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-nxf9p"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.177793 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" event={"ID":"b1512c43-7cc0-4c7e-82f0-108811e38971","Type":"ContainerStarted","Data":"3fadef24ef558c8d5b6c66f8fbf12ef6472d62307559f1e9a7a2af44c65f4a7b"}
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.178765 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.190766 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.203716 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c45bf5c76-pg4wx" podStartSLOduration=2.203682795 podStartE2EDuration="2.203682795s" podCreationTimestamp="2025-11-22 03:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:06:28.19936403 +0000 UTC m=+752.505381303" watchObservedRunningTime="2025-11-22 03:06:28.203682795 +0000 UTC m=+752.509700088"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.241030 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.249218 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7b13d75a-b01b-4879-9ab2-1c5ffb445c38-memberlist\") pod \"speaker-q9fcb\" (UID: \"7b13d75a-b01b-4879-9ab2-1c5ffb445c38\") " pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.256968 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-nxf9p" podStartSLOduration=2.256942633 podStartE2EDuration="2.256942633s" podCreationTimestamp="2025-11-22 03:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:06:28.229157703 +0000 UTC m=+752.535174986" watchObservedRunningTime="2025-11-22 03:06:28.256942633 +0000 UTC m=+752.562959906"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.320597 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.342077 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:06:28 crc kubenswrapper[4952]: I1122 03:06:28.342149 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:06:28 crc kubenswrapper[4952]: W1122 03:06:28.347637 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b13d75a_b01b_4879_9ab2_1c5ffb445c38.slice/crio-9216d95255b773875b99d2bab7a2d8cc1757b999b39101f632bf02e86a34a435 WatchSource:0}: Error finding container 9216d95255b773875b99d2bab7a2d8cc1757b999b39101f632bf02e86a34a435: Status 404 returned error can't find the container with id 9216d95255b773875b99d2bab7a2d8cc1757b999b39101f632bf02e86a34a435
Nov 22 03:06:29 crc kubenswrapper[4952]: I1122 03:06:29.210307 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9fcb" event={"ID":"7b13d75a-b01b-4879-9ab2-1c5ffb445c38","Type":"ContainerStarted","Data":"7193174deabda1855efb7b50ce792abbb5ba9ee3f9ee52c513ecc1dbb47e0cab"}
Nov 22 03:06:29 crc kubenswrapper[4952]: I1122 03:06:29.210771 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9fcb" event={"ID":"7b13d75a-b01b-4879-9ab2-1c5ffb445c38","Type":"ContainerStarted","Data":"549cd524514f57681c5dfea2da68a1732c7bd05cd4a1133ff05bc5bad330ea67"}
Nov 22 03:06:29 crc kubenswrapper[4952]: I1122 03:06:29.210784 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9fcb" event={"ID":"7b13d75a-b01b-4879-9ab2-1c5ffb445c38","Type":"ContainerStarted","Data":"9216d95255b773875b99d2bab7a2d8cc1757b999b39101f632bf02e86a34a435"}
Nov 22 03:06:29 crc kubenswrapper[4952]: I1122 03:06:29.211117 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:29 crc kubenswrapper[4952]: I1122 03:06:29.231192 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55b468cbf8-w6lh9" podStartSLOduration=4.231170307 podStartE2EDuration="4.231170307s" podCreationTimestamp="2025-11-22 03:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:06:28.299949548 +0000 UTC m=+752.605966821" watchObservedRunningTime="2025-11-22 03:06:29.231170307 +0000 UTC m=+753.537187580"
Nov 22 03:06:32 crc kubenswrapper[4952]: I1122 03:06:32.004997 4952 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.262848 4952 generic.go:334] "Generic (PLEG): container finished" podID="59bb616f-6078-47be-a7d0-16749039f128" containerID="4b93e732f527a1a6161aecc1464e27bc39a56cba9c7048b4db24a486d7203011" exitCode=0
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.263741 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerDied","Data":"4b93e732f527a1a6161aecc1464e27bc39a56cba9c7048b4db24a486d7203011"}
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.271172 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" event={"ID":"b1512c43-7cc0-4c7e-82f0-108811e38971","Type":"ContainerStarted","Data":"b475f58cf0d7f4faea2daa9b34f88c7799e790dcadacd88b3be73dba022cd802"}
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.271416 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf"
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.321898 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-q9fcb" podStartSLOduration=10.321874251 podStartE2EDuration="10.321874251s" podCreationTimestamp="2025-11-22 03:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:06:29.235790109 +0000 UTC m=+753.541807402" watchObservedRunningTime="2025-11-22 03:06:36.321874251 +0000 UTC m=+760.627891524"
Nov 22 03:06:36 crc kubenswrapper[4952]: I1122 03:06:36.337777 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf" podStartSLOduration=2.333635325 podStartE2EDuration="10.337753224s" podCreationTimestamp="2025-11-22 03:06:26 +0000 UTC" firstStartedPulling="2025-11-22 03:06:27.444891626 +0000 UTC m=+751.750908899" lastFinishedPulling="2025-11-22 03:06:35.449009525 +0000 UTC m=+759.755026798" observedRunningTime="2025-11-22 03:06:36.333558022 +0000 UTC m=+760.639575305" watchObservedRunningTime="2025-11-22 03:06:36.337753224 +0000 UTC m=+760.643770507"
Nov 22 03:06:37 crc kubenswrapper[4952]: I1122 03:06:37.136974 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-nxf9p"
Nov 22 03:06:37 crc kubenswrapper[4952]: I1122 03:06:37.279721 4952 generic.go:334] "Generic (PLEG): container finished" podID="59bb616f-6078-47be-a7d0-16749039f128" containerID="4738ef95ac8a4c00104a9ea4c9587aa024579ed2878ed6babed303f35075b079" exitCode=0
Nov 22 03:06:37 crc kubenswrapper[4952]: I1122 03:06:37.279800 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerDied","Data":"4738ef95ac8a4c00104a9ea4c9587aa024579ed2878ed6babed303f35075b079"}
Nov 22 03:06:38 crc kubenswrapper[4952]: I1122 03:06:38.290441 4952 generic.go:334] "Generic (PLEG): container finished" podID="59bb616f-6078-47be-a7d0-16749039f128" containerID="b47c123acfa3d70672351413c6831a470d4f9b7b1453b54af09c6d4aa6578c3c" exitCode=0
Nov 22 03:06:38 crc kubenswrapper[4952]: I1122 03:06:38.290565 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerDied","Data":"b47c123acfa3d70672351413c6831a470d4f9b7b1453b54af09c6d4aa6578c3c"}
Nov 22 03:06:38 crc kubenswrapper[4952]: I1122 03:06:38.328533 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-q9fcb"
Nov 22 03:06:39 crc kubenswrapper[4952]: I1122 03:06:39.303557 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"e926da20c6f4abe9e03e5e839adb964e42b30cf575cffdb69db9ab43fcd5a3a1"}
Nov 22 03:06:39 crc kubenswrapper[4952]: I1122 03:06:39.304049 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"f75433a9c1a5ba9e9af169b5a97201cab92fe3266420fbdc061940acc1d538a3"}
Nov 22 03:06:39 crc kubenswrapper[4952]: I1122 03:06:39.304062 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"cb124dcdaebe8d6bbf66e14882f2211eb22e1c49e773dfe481617ebf9a21fb0e"}
Nov 22 03:06:39 crc kubenswrapper[4952]: I1122 03:06:39.304073 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"55ac0a6887ed2f6448d0dc933bb0fc875707a08f1f0f81feb59253dea44fb51e"}
Nov 22 03:06:39 crc kubenswrapper[4952]: I1122 03:06:39.304084 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"c7de560e4e52b36cdb8727b7f1db2c051cf6cd6850c102b60fb0512d070d81e1"}
Nov 22 03:06:40 crc kubenswrapper[4952]: I1122 03:06:40.316850 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7rpg" event={"ID":"59bb616f-6078-47be-a7d0-16749039f128","Type":"ContainerStarted","Data":"e33ed6a18c100438e27aafb9319560a43ea0aae1b567a4f65a34551a18e18b58"}
Nov 22 03:06:40 crc kubenswrapper[4952]: I1122 03:06:40.317138 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p7rpg"
Nov 22 03:06:40 crc kubenswrapper[4952]: I1122 03:06:40.357771 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-p7rpg" podStartSLOduration=5.856363432 podStartE2EDuration="14.357740207s" podCreationTimestamp="2025-11-22 03:06:26 +0000 UTC" firstStartedPulling="2025-11-22 03:06:26.966613695 +0000 UTC m=+751.272630968" lastFinishedPulling="2025-11-22 03:06:35.46799047 +0000 UTC m=+759.774007743" observedRunningTime="2025-11-22 03:06:40.349588499 +0000 UTC m=+764.655605782" watchObservedRunningTime="2025-11-22 03:06:40.357740207 +0000 UTC m=+764.663757520"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.710255 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p7rpg"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.776988 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p7rpg"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.857853 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.859007 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.865157 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.884610 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.918179 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:41 crc kubenswrapper[4952]: I1122 03:06:41.989867 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpwsq\" (UniqueName: \"kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq\") pod \"openstack-operator-index-jcvh9\" (UID: \"87fc54ad-677d-4e2e-a0b3-35c46d3ca038\") " pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:42 crc kubenswrapper[4952]: I1122 03:06:42.091066 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpwsq\" (UniqueName: \"kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq\") pod \"openstack-operator-index-jcvh9\" (UID: \"87fc54ad-677d-4e2e-a0b3-35c46d3ca038\") " pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:42 crc kubenswrapper[4952]: I1122 03:06:42.117357 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpwsq\" (UniqueName: \"kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq\") pod \"openstack-operator-index-jcvh9\" (UID: \"87fc54ad-677d-4e2e-a0b3-35c46d3ca038\") " pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:42 crc kubenswrapper[4952]: I1122 03:06:42.179115 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:42 crc kubenswrapper[4952]: I1122 03:06:42.642754 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:42 crc kubenswrapper[4952]: W1122 03:06:42.650926 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87fc54ad_677d_4e2e_a0b3_35c46d3ca038.slice/crio-799fbdc1c0a99ad62b7bf49b64e6a369c99261b497c69cd9a44ce7a2fecc14a7 WatchSource:0}: Error finding container 799fbdc1c0a99ad62b7bf49b64e6a369c99261b497c69cd9a44ce7a2fecc14a7: Status 404 returned error can't find the container with id 799fbdc1c0a99ad62b7bf49b64e6a369c99261b497c69cd9a44ce7a2fecc14a7
Nov 22 03:06:43 crc kubenswrapper[4952]: I1122 03:06:43.350063 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jcvh9" event={"ID":"87fc54ad-677d-4e2e-a0b3-35c46d3ca038","Type":"ContainerStarted","Data":"799fbdc1c0a99ad62b7bf49b64e6a369c99261b497c69cd9a44ce7a2fecc14a7"}
Nov 22 03:06:44 crc kubenswrapper[4952]: I1122 03:06:44.608223 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.218978 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xbtqc"]
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.220009 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.225454 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mps4h"
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.244181 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xbtqc"]
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.346538 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmrgz\" (UniqueName: \"kubernetes.io/projected/37332314-a4d3-4c04-a480-561c80a2fa8a-kube-api-access-nmrgz\") pod \"openstack-operator-index-xbtqc\" (UID: \"37332314-a4d3-4c04-a480-561c80a2fa8a\") " pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.449050 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmrgz\" (UniqueName: \"kubernetes.io/projected/37332314-a4d3-4c04-a480-561c80a2fa8a-kube-api-access-nmrgz\") pod \"openstack-operator-index-xbtqc\" (UID: \"37332314-a4d3-4c04-a480-561c80a2fa8a\") " pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.479203 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmrgz\" (UniqueName: \"kubernetes.io/projected/37332314-a4d3-4c04-a480-561c80a2fa8a-kube-api-access-nmrgz\") pod \"openstack-operator-index-xbtqc\" (UID: \"37332314-a4d3-4c04-a480-561c80a2fa8a\") " pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:45 crc kubenswrapper[4952]: I1122 03:06:45.554076 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.373454 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jcvh9" event={"ID":"87fc54ad-677d-4e2e-a0b3-35c46d3ca038","Type":"ContainerStarted","Data":"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"}
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.373672 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jcvh9" podUID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" containerName="registry-server" containerID="cri-o://ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275" gracePeriod=2
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.400785 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jcvh9" podStartSLOduration=1.860944172 podStartE2EDuration="5.40070421s" podCreationTimestamp="2025-11-22 03:06:41 +0000 UTC" firstStartedPulling="2025-11-22 03:06:42.653975172 +0000 UTC m=+766.959992475" lastFinishedPulling="2025-11-22 03:06:46.1937352 +0000 UTC m=+770.499752513" observedRunningTime="2025-11-22 03:06:46.394150835 +0000 UTC m=+770.700168128" watchObservedRunningTime="2025-11-22 03:06:46.40070421 +0000 UTC m=+770.706721543"
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.528848 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xbtqc"]
Nov 22 03:06:46 crc kubenswrapper[4952]: W1122 03:06:46.539175 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37332314_a4d3_4c04_a480_561c80a2fa8a.slice/crio-b5c468ad9458011516d62446082547de1a1f3673ab5b49434a116f608edefe74 WatchSource:0}: Error finding container b5c468ad9458011516d62446082547de1a1f3673ab5b49434a116f608edefe74: Status 404 returned error can't find the container with id b5c468ad9458011516d62446082547de1a1f3673ab5b49434a116f608edefe74
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.691755 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-676cf"
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.845041 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jcvh9_87fc54ad-677d-4e2e-a0b3-35c46d3ca038/registry-server/0.log"
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.845124 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.973256 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpwsq\" (UniqueName: \"kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq\") pod \"87fc54ad-677d-4e2e-a0b3-35c46d3ca038\" (UID: \"87fc54ad-677d-4e2e-a0b3-35c46d3ca038\") "
Nov 22 03:06:46 crc kubenswrapper[4952]: I1122 03:06:46.978299 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq" (OuterVolumeSpecName: "kube-api-access-rpwsq") pod "87fc54ad-677d-4e2e-a0b3-35c46d3ca038" (UID: "87fc54ad-677d-4e2e-a0b3-35c46d3ca038"). InnerVolumeSpecName "kube-api-access-rpwsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.075712 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpwsq\" (UniqueName: \"kubernetes.io/projected/87fc54ad-677d-4e2e-a0b3-35c46d3ca038-kube-api-access-rpwsq\") on node \"crc\" DevicePath \"\""
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.381773 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xbtqc" event={"ID":"37332314-a4d3-4c04-a480-561c80a2fa8a","Type":"ContainerStarted","Data":"fcb520639a3e28dddaf968d739417d3b9bf783f72f3ddf0f850d47bf1fab4c72"}
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.381847 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xbtqc" event={"ID":"37332314-a4d3-4c04-a480-561c80a2fa8a","Type":"ContainerStarted","Data":"b5c468ad9458011516d62446082547de1a1f3673ab5b49434a116f608edefe74"}
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.386253 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jcvh9_87fc54ad-677d-4e2e-a0b3-35c46d3ca038/registry-server/0.log"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.386821 4952 generic.go:334] "Generic (PLEG): container finished" podID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" containerID="ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275" exitCode=2
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.386869 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jcvh9" event={"ID":"87fc54ad-677d-4e2e-a0b3-35c46d3ca038","Type":"ContainerDied","Data":"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"}
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.386909 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jcvh9" event={"ID":"87fc54ad-677d-4e2e-a0b3-35c46d3ca038","Type":"ContainerDied","Data":"799fbdc1c0a99ad62b7bf49b64e6a369c99261b497c69cd9a44ce7a2fecc14a7"}
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.386939 4952 scope.go:117] "RemoveContainer" containerID="ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.387051 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jcvh9"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.405110 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xbtqc" podStartSLOduration=2.311982978 podStartE2EDuration="2.405065296s" podCreationTimestamp="2025-11-22 03:06:45 +0000 UTC" firstStartedPulling="2025-11-22 03:06:46.545886404 +0000 UTC m=+770.851903677" lastFinishedPulling="2025-11-22 03:06:46.638968682 +0000 UTC m=+770.944985995" observedRunningTime="2025-11-22 03:06:47.402537448 +0000 UTC m=+771.708554721" watchObservedRunningTime="2025-11-22 03:06:47.405065296 +0000 UTC m=+771.711082569"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.416697 4952 scope.go:117] "RemoveContainer" containerID="ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"
Nov 22 03:06:47 crc kubenswrapper[4952]: E1122 03:06:47.417427 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275\": container with ID starting with ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275 not found: ID does not exist" containerID="ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.417484 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275"} err="failed to get container status \"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275\": rpc error: code = NotFound desc = could not find container \"ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275\": container with ID starting with ad7945489a725804baae0f8d5df28958590c7b662d7dbd8dd8eaa6bda66a9275 not found: ID does not exist"
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.444446 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:47 crc kubenswrapper[4952]: I1122 03:06:47.452162 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jcvh9"]
Nov 22 03:06:48 crc kubenswrapper[4952]: I1122 03:06:48.539677 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" path="/var/lib/kubelet/pods/87fc54ad-677d-4e2e-a0b3-35c46d3ca038/volumes"
Nov 22 03:06:55 crc kubenswrapper[4952]: I1122 03:06:55.555139 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:55 crc kubenswrapper[4952]: I1122 03:06:55.556117 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:55 crc kubenswrapper[4952]: I1122 03:06:55.587729 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:56 crc kubenswrapper[4952]: I1122 03:06:56.520637 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xbtqc"
Nov 22 03:06:56 crc kubenswrapper[4952]: I1122 03:06:56.715446 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p7rpg"
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.342409 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.343113 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.343200 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.344303 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.344409 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b" gracePeriod=600
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.502465 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b" exitCode=0
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.502532 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b"}
Nov 22 03:06:58 crc kubenswrapper[4952]: I1122 03:06:58.502606 4952 scope.go:117] "RemoveContainer" containerID="0cf1c8c9fd6e281870ad88809e9296851217d3eb9921ce023095d72e4315fecb"
Nov 22 03:06:59 crc kubenswrapper[4952]: I1122 03:06:59.539765 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19"}
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.427183 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfrxn"]
Nov 22 03:07:02 crc kubenswrapper[4952]: E1122 03:07:02.428508 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" containerName="registry-server"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.428534 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" containerName="registry-server"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.428788 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fc54ad-677d-4e2e-a0b3-35c46d3ca038" containerName="registry-server"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.430295 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.443628 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfrxn"]
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.480948 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.481044 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77s5w\" (UniqueName: \"kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.481110 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.583146 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77s5w\" (UniqueName: \"kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.583247 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.583318 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.584010 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.584393 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.611865 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77s5w\" (UniqueName: \"kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w\") pod \"community-operators-lfrxn\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") " pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:02 crc kubenswrapper[4952]: I1122 03:07:02.773826 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.059883 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfrxn"]
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.572586 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerStarted","Data":"70bff83d0f3fda9061fdd1c2709bde2fe8fb86111db710ef2293537f46ec1777"}
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.649793 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"]
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.651316 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.654375 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jw26j"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.666033 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"]
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.702967 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg65g\" (UniqueName: \"kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.703085 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.703179 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.804953 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.805039 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg65g\" (UniqueName: \"kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.805089 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.805602 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.805667 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.834499 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg65g\" (UniqueName: \"kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g\") pod \"d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") " pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:03 crc kubenswrapper[4952]: I1122 03:07:03.968903 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:04 crc kubenswrapper[4952]: I1122 03:07:04.443485 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"]
Nov 22 03:07:04 crc kubenswrapper[4952]: I1122 03:07:04.586613 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl" event={"ID":"87e6d798-15ee-44f5-9e53-6bd17839c89d","Type":"ContainerStarted","Data":"4ddbff0f9bd9a229eb6ec553c10f3210fcf5672a28b53d6bc6f7183ff12451a2"}
Nov 22 03:07:05 crc kubenswrapper[4952]: I1122 03:07:05.596068 4952 generic.go:334] "Generic (PLEG): container finished" podID="cc1d3c44-b627-4108-b478-f083873e46a5" containerID="96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0" exitCode=0
Nov 22 03:07:05 crc kubenswrapper[4952]: I1122 03:07:05.596213 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerDied","Data":"96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0"}
Nov 22 03:07:05 crc kubenswrapper[4952]: I1122 03:07:05.602181 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl" event={"ID":"87e6d798-15ee-44f5-9e53-6bd17839c89d","Type":"ContainerDied","Data":"2a75915142f3f8c4c0166f10f53a4244aa3f298dc7d75928b56c3db5affd9bd2"}
Nov 22 03:07:05 crc kubenswrapper[4952]: I1122 03:07:05.603131 4952 generic.go:334] "Generic (PLEG): container finished" podID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerID="2a75915142f3f8c4c0166f10f53a4244aa3f298dc7d75928b56c3db5affd9bd2" exitCode=0
Nov 22 03:07:06 crc kubenswrapper[4952]: I1122 03:07:06.615383 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerStarted","Data":"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5"}
Nov 22 03:07:06 crc kubenswrapper[4952]: I1122 03:07:06.619263 4952 generic.go:334] "Generic (PLEG): container finished" podID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerID="7b3b8e653d28a496b831d87f4c6179ce32f5ba4905a4c23e0250adfbc3b6765f" exitCode=0
Nov 22 03:07:06 crc kubenswrapper[4952]: I1122 03:07:06.619341 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl" event={"ID":"87e6d798-15ee-44f5-9e53-6bd17839c89d","Type":"ContainerDied","Data":"7b3b8e653d28a496b831d87f4c6179ce32f5ba4905a4c23e0250adfbc3b6765f"}
Nov 22 03:07:07 crc kubenswrapper[4952]: I1122 03:07:07.628715 4952 generic.go:334] "Generic (PLEG): container finished" podID="cc1d3c44-b627-4108-b478-f083873e46a5" containerID="ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5" exitCode=0
Nov 22 03:07:07 crc kubenswrapper[4952]: I1122 03:07:07.628911 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerDied","Data":"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5"}
Nov 22 03:07:07 crc kubenswrapper[4952]: I1122 03:07:07.635328 4952 generic.go:334] "Generic (PLEG): container finished" podID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerID="afb0678646d1a31795db0dae562559fe5be46dc8b4bd430fd7846f5509cc0fd3" exitCode=0
Nov 22 03:07:07 crc kubenswrapper[4952]: I1122 03:07:07.635359 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl" event={"ID":"87e6d798-15ee-44f5-9e53-6bd17839c89d","Type":"ContainerDied","Data":"afb0678646d1a31795db0dae562559fe5be46dc8b4bd430fd7846f5509cc0fd3"}
Nov 22 03:07:08 crc kubenswrapper[4952]: I1122 03:07:08.646811 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerStarted","Data":"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b"}
Nov 22 03:07:08 crc kubenswrapper[4952]: I1122 03:07:08.983193 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.003732 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfrxn" podStartSLOduration=4.583918276 podStartE2EDuration="7.003713031s" podCreationTimestamp="2025-11-22 03:07:02 +0000 UTC" firstStartedPulling="2025-11-22 03:07:05.600341494 +0000 UTC m=+789.906358797" lastFinishedPulling="2025-11-22 03:07:08.020136279 +0000 UTC m=+792.326153552" observedRunningTime="2025-11-22 03:07:08.665912529 +0000 UTC m=+792.971929802" watchObservedRunningTime="2025-11-22 03:07:09.003713031 +0000 UTC m=+793.309730304"
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.092358 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg65g\" (UniqueName: \"kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g\") pod \"87e6d798-15ee-44f5-9e53-6bd17839c89d\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") "
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.092495 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle\") pod \"87e6d798-15ee-44f5-9e53-6bd17839c89d\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") "
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.092553 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util\") pod \"87e6d798-15ee-44f5-9e53-6bd17839c89d\" (UID: \"87e6d798-15ee-44f5-9e53-6bd17839c89d\") "
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.093606 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle" (OuterVolumeSpecName: "bundle") pod "87e6d798-15ee-44f5-9e53-6bd17839c89d" (UID: "87e6d798-15ee-44f5-9e53-6bd17839c89d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.099072 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g" (OuterVolumeSpecName: "kube-api-access-sg65g") pod "87e6d798-15ee-44f5-9e53-6bd17839c89d" (UID: "87e6d798-15ee-44f5-9e53-6bd17839c89d"). InnerVolumeSpecName "kube-api-access-sg65g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.107301 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util" (OuterVolumeSpecName: "util") pod "87e6d798-15ee-44f5-9e53-6bd17839c89d" (UID: "87e6d798-15ee-44f5-9e53-6bd17839c89d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.194318 4952 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.194380 4952 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87e6d798-15ee-44f5-9e53-6bd17839c89d-util\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.194411 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg65g\" (UniqueName: \"kubernetes.io/projected/87e6d798-15ee-44f5-9e53-6bd17839c89d-kube-api-access-sg65g\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.660659 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl" event={"ID":"87e6d798-15ee-44f5-9e53-6bd17839c89d","Type":"ContainerDied","Data":"4ddbff0f9bd9a229eb6ec553c10f3210fcf5672a28b53d6bc6f7183ff12451a2"}
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.660715 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl"
Nov 22 03:07:09 crc kubenswrapper[4952]: I1122 03:07:09.660732 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ddbff0f9bd9a229eb6ec553c10f3210fcf5672a28b53d6bc6f7183ff12451a2"
Nov 22 03:07:12 crc kubenswrapper[4952]: I1122 03:07:12.774031 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:12 crc kubenswrapper[4952]: I1122 03:07:12.774582 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:12 crc kubenswrapper[4952]: I1122 03:07:12.818203 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:13 crc kubenswrapper[4952]: I1122 03:07:13.735480 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.270631 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"]
Nov 22 03:07:14 crc kubenswrapper[4952]: E1122 03:07:14.271372 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="pull"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.271389 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="pull"
Nov 22 03:07:14 crc kubenswrapper[4952]: E1122 03:07:14.271400 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="util"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.271410 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="util"
Nov 22 03:07:14 crc kubenswrapper[4952]: E1122 03:07:14.271435 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="extract"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.271443 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="extract"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.271630 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e6d798-15ee-44f5-9e53-6bd17839c89d" containerName="extract"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.272527 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.275050 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2ccjs"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.376942 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skdbp\" (UniqueName: \"kubernetes.io/projected/79375e72-ee47-4b49-95aa-bcf8e211145f-kube-api-access-skdbp\") pod \"openstack-operator-controller-operator-66c8cfd656-bvmf8\" (UID: \"79375e72-ee47-4b49-95aa-bcf8e211145f\") " pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.382359 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"]
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.478151 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skdbp\" (UniqueName: \"kubernetes.io/projected/79375e72-ee47-4b49-95aa-bcf8e211145f-kube-api-access-skdbp\") pod \"openstack-operator-controller-operator-66c8cfd656-bvmf8\" (UID: \"79375e72-ee47-4b49-95aa-bcf8e211145f\") " pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.501681 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skdbp\" (UniqueName: \"kubernetes.io/projected/79375e72-ee47-4b49-95aa-bcf8e211145f-kube-api-access-skdbp\") pod \"openstack-operator-controller-operator-66c8cfd656-bvmf8\" (UID: \"79375e72-ee47-4b49-95aa-bcf8e211145f\") " pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"
Nov 22 03:07:14 crc kubenswrapper[4952]: I1122 03:07:14.596425 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"
Nov 22 03:07:15 crc kubenswrapper[4952]: I1122 03:07:15.071589 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8"]
Nov 22 03:07:15 crc kubenswrapper[4952]: W1122 03:07:15.085060 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79375e72_ee47_4b49_95aa_bcf8e211145f.slice/crio-6309ed039ce05cd6c60dbec73605d9d8b628e534830521b3bad88768682ac5f6 WatchSource:0}: Error finding container 6309ed039ce05cd6c60dbec73605d9d8b628e534830521b3bad88768682ac5f6: Status 404 returned error can't find the container with id 6309ed039ce05cd6c60dbec73605d9d8b628e534830521b3bad88768682ac5f6
Nov 22 03:07:15 crc kubenswrapper[4952]: I1122 03:07:15.709104 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" event={"ID":"79375e72-ee47-4b49-95aa-bcf8e211145f","Type":"ContainerStarted","Data":"6309ed039ce05cd6c60dbec73605d9d8b628e534830521b3bad88768682ac5f6"}
Nov 22 03:07:15 crc kubenswrapper[4952]: I1122 03:07:15.811628 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfrxn"]
Nov 22 03:07:15 crc kubenswrapper[4952]: I1122 03:07:15.812761 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfrxn" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="registry-server" containerID="cri-o://3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b" gracePeriod=2
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.241510 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfrxn"
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.332130 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content\") pod \"cc1d3c44-b627-4108-b478-f083873e46a5\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") "
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.332200 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities\") pod \"cc1d3c44-b627-4108-b478-f083873e46a5\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") "
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.332268 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77s5w\" (UniqueName: \"kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w\") pod \"cc1d3c44-b627-4108-b478-f083873e46a5\" (UID: \"cc1d3c44-b627-4108-b478-f083873e46a5\") "
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.339506 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities" (OuterVolumeSpecName: "utilities") pod "cc1d3c44-b627-4108-b478-f083873e46a5" (UID: "cc1d3c44-b627-4108-b478-f083873e46a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.339702 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w" (OuterVolumeSpecName: "kube-api-access-77s5w") pod "cc1d3c44-b627-4108-b478-f083873e46a5" (UID: "cc1d3c44-b627-4108-b478-f083873e46a5"). InnerVolumeSpecName "kube-api-access-77s5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.415258 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc1d3c44-b627-4108-b478-f083873e46a5" (UID: "cc1d3c44-b627-4108-b478-f083873e46a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.433879 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77s5w\" (UniqueName: \"kubernetes.io/projected/cc1d3c44-b627-4108-b478-f083873e46a5-kube-api-access-77s5w\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.434241 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.434254 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1d3c44-b627-4108-b478-f083873e46a5-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.729878 4952 generic.go:334] "Generic (PLEG): container finished" podID="cc1d3c44-b627-4108-b478-f083873e46a5" containerID="3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b" exitCode=0
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.729940 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerDied","Data":"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b"}
Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.729997 4952 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-lfrxn" Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.730023 4952 scope.go:117] "RemoveContainer" containerID="3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b" Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.730008 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfrxn" event={"ID":"cc1d3c44-b627-4108-b478-f083873e46a5","Type":"ContainerDied","Data":"70bff83d0f3fda9061fdd1c2709bde2fe8fb86111db710ef2293537f46ec1777"} Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.751470 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfrxn"] Nov 22 03:07:16 crc kubenswrapper[4952]: I1122 03:07:16.756392 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfrxn"] Nov 22 03:07:18 crc kubenswrapper[4952]: I1122 03:07:18.331394 4952 scope.go:117] "RemoveContainer" containerID="ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5" Nov 22 03:07:18 crc kubenswrapper[4952]: I1122 03:07:18.540724 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" path="/var/lib/kubelet/pods/cc1d3c44-b627-4108-b478-f083873e46a5/volumes" Nov 22 03:07:18 crc kubenswrapper[4952]: I1122 03:07:18.919212 4952 scope.go:117] "RemoveContainer" containerID="96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.004695 4952 scope.go:117] "RemoveContainer" containerID="3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b" Nov 22 03:07:19 crc kubenswrapper[4952]: E1122 03:07:19.005412 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b\": container with ID starting with 3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b not found: ID does not exist" containerID="3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.005475 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b"} err="failed to get container status \"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b\": rpc error: code = NotFound desc = could not find container \"3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b\": container with ID starting with 3ed2c79f92927cc6a3cf0e3f79a65ffaeef38d6f5964022ddc36e709ff0df06b not found: ID does not exist" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.005513 4952 scope.go:117] "RemoveContainer" containerID="ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5" Nov 22 03:07:19 crc kubenswrapper[4952]: E1122 03:07:19.006012 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5\": container with ID starting with ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5 not found: ID does not exist" containerID="ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.006060 4952 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5"} err="failed to get container status \"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5\": rpc error: code = NotFound desc = could not find container \"ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5\": container with ID starting with ff3d446e4aafdf9bf8026d87c74fb30c74593fc0f341856ee7a7457a2a271ae5 not found: ID does not exist" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.006079 4952 scope.go:117] "RemoveContainer" containerID="96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0" Nov 22 03:07:19 crc kubenswrapper[4952]: E1122 03:07:19.006396 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0\": container with ID starting with 96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0 not found: ID does not exist" containerID="96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.006444 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0"} err="failed to get container status \"96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0\": rpc error: code = NotFound desc = could not find container \"96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0\": container with ID starting with 96c10778a3775a6c9390ecdb9eb6d0ab0455dae27f4b7b434baf22e3207a98f0 not found: ID does not exist" Nov 22 03:07:19 crc kubenswrapper[4952]: I1122 03:07:19.760389 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" event={"ID":"79375e72-ee47-4b49-95aa-bcf8e211145f","Type":"ContainerStarted","Data":"af9b5a70339322b619a59ad0de70e65a588bd1f4222b3f36b3fdce607e7df023"} Nov 22 03:07:21 crc kubenswrapper[4952]: I1122 03:07:21.775574 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" event={"ID":"79375e72-ee47-4b49-95aa-bcf8e211145f","Type":"ContainerStarted","Data":"e08dbcf52b915aeb96cf39582e1ce645304de2daee665f51fe0cebe26f0fa4e6"} Nov 22 03:07:21 crc kubenswrapper[4952]: I1122 03:07:21.775979 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" Nov 22 03:07:21 crc kubenswrapper[4952]: I1122 03:07:21.805585 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" podStartSLOduration=1.630439634 podStartE2EDuration="7.805509265s" podCreationTimestamp="2025-11-22 03:07:14 +0000 UTC" firstStartedPulling="2025-11-22 03:07:15.088259292 +0000 UTC m=+799.394276565" lastFinishedPulling="2025-11-22 03:07:21.263328923 +0000 UTC m=+805.569346196" observedRunningTime="2025-11-22 03:07:21.803357118 +0000 UTC m=+806.109374391" watchObservedRunningTime="2025-11-22 03:07:21.805509265 +0000 UTC m=+806.111526578" Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.600508 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-66c8cfd656-bvmf8" Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.620164 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"]
Nov 22 03:07:24 crc kubenswrapper[4952]: E1122 03:07:24.620824 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="extract-content"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.620849 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="extract-content"
Nov 22 03:07:24 crc kubenswrapper[4952]: E1122 03:07:24.620863 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="registry-server"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.620876 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="registry-server"
Nov 22 03:07:24 crc kubenswrapper[4952]: E1122 03:07:24.620895 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="extract-utilities"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.620905 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="extract-utilities"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.621109 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1d3c44-b627-4108-b478-f083873e46a5" containerName="registry-server"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.622315 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.638370 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"]
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.724148 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjd4w\" (UniqueName: \"kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.724280 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.724313 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.813774 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"]
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.815154 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.826578 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjd4w\" (UniqueName: \"kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.826699 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.826739 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.827259 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.827370 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.841670 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"]
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.861331 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjd4w\" (UniqueName: \"kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w\") pod \"certified-operators-wbw6s\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.928533 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.928689 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.928733 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k5kg\" (UniqueName: \"kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:24 crc kubenswrapper[4952]: I1122 03:07:24.952633 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbw6s"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.030255 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.030419 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.030481 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k5kg\" (UniqueName: \"kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.031276 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.031471 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.058406 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k5kg\" (UniqueName: \"kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg\") pod \"redhat-marketplace-pbzxf\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.157635 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbzxf"
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.517063 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"]
Nov 22 03:07:25 crc kubenswrapper[4952]: W1122 03:07:25.527587 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae57b38_9d8a_4c42_93c6_4811d0ffe144.slice/crio-ab20e65fc5ab2f03e52f5803f66dfff37eca67aa4e34158750d9b5b0cac78134 WatchSource:0}: Error finding container ab20e65fc5ab2f03e52f5803f66dfff37eca67aa4e34158750d9b5b0cac78134: Status 404 returned error can't find the container with id ab20e65fc5ab2f03e52f5803f66dfff37eca67aa4e34158750d9b5b0cac78134
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.554204 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"]
Nov 22 03:07:25 crc kubenswrapper[4952]: W1122 03:07:25.558630 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1b715d4_384c_40b3_a393_b880ae580966.slice/crio-63aa574447f9cd27a035a6218c7451076b98e4c88130657ed4b87bac84434623 WatchSource:0}: Error finding container 63aa574447f9cd27a035a6218c7451076b98e4c88130657ed4b87bac84434623: Status 404 returned error can't find the container with id 63aa574447f9cd27a035a6218c7451076b98e4c88130657ed4b87bac84434623
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.808599 4952 generic.go:334] "Generic (PLEG): container finished" podID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerID="d3d3862a1f95a0519ffbc84e51265036319dea4e77c041285903fcf98750460d" exitCode=0
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.808701 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerDied","Data":"d3d3862a1f95a0519ffbc84e51265036319dea4e77c041285903fcf98750460d"}
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.808736 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerStarted","Data":"ab20e65fc5ab2f03e52f5803f66dfff37eca67aa4e34158750d9b5b0cac78134"}
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.811787 4952 generic.go:334] "Generic (PLEG): container finished" podID="c1b715d4-384c-40b3-a393-b880ae580966" containerID="de71cf01e1180f2c7e410e40518ffd555fedd7c0615418ceb45e9b6b27c696d5" exitCode=0
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.811861 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerDied","Data":"de71cf01e1180f2c7e410e40518ffd555fedd7c0615418ceb45e9b6b27c696d5"}
Nov 22 03:07:25 crc kubenswrapper[4952]: I1122 03:07:25.811898 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerStarted","Data":"63aa574447f9cd27a035a6218c7451076b98e4c88130657ed4b87bac84434623"}
event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerStarted","Data":"c895ba1bd6b2259bbeb492d5c353b2711e969e64d4ae100ea991b8a91ad0744e"} Nov 22 03:07:26 crc kubenswrapper[4952]: I1122 03:07:26.824133 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerStarted","Data":"68c175b5dd1047e4fa256655a03cf31b8238929a3c2fa7ac1d8c80e511e2eca8"} Nov 22 03:07:27 crc kubenswrapper[4952]: I1122 03:07:27.833657 4952 generic.go:334] "Generic (PLEG): container finished" podID="c1b715d4-384c-40b3-a393-b880ae580966" containerID="c895ba1bd6b2259bbeb492d5c353b2711e969e64d4ae100ea991b8a91ad0744e" exitCode=0 Nov 22 03:07:27 crc kubenswrapper[4952]: I1122 03:07:27.833751 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerDied","Data":"c895ba1bd6b2259bbeb492d5c353b2711e969e64d4ae100ea991b8a91ad0744e"} Nov 22 03:07:27 crc kubenswrapper[4952]: I1122 03:07:27.837000 4952 generic.go:334] "Generic (PLEG): container finished" podID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerID="68c175b5dd1047e4fa256655a03cf31b8238929a3c2fa7ac1d8c80e511e2eca8" exitCode=0 Nov 22 03:07:27 crc kubenswrapper[4952]: I1122 03:07:27.837065 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerDied","Data":"68c175b5dd1047e4fa256655a03cf31b8238929a3c2fa7ac1d8c80e511e2eca8"} Nov 22 03:07:28 crc kubenswrapper[4952]: I1122 03:07:28.846807 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerStarted","Data":"9357d48d6f2d26e5d6c7c387a102ab01098d8cd2913910d6fb107fe7a0d568e6"} Nov 22 03:07:28 crc kubenswrapper[4952]: I1122 03:07:28.851209 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerStarted","Data":"33a8dfd5ec2ae59a45aaf811edf60742bf76a9545c19fd462c5a24f901a45d87"} Nov 22 03:07:28 crc kubenswrapper[4952]: I1122 03:07:28.873459 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbw6s" podStartSLOduration=2.408754054 podStartE2EDuration="4.873432283s" podCreationTimestamp="2025-11-22 03:07:24 +0000 UTC" firstStartedPulling="2025-11-22 03:07:25.810624012 +0000 UTC m=+810.116641285" lastFinishedPulling="2025-11-22 03:07:28.275302241 +0000 UTC m=+812.581319514" observedRunningTime="2025-11-22 03:07:28.868212675 +0000 UTC m=+813.174229948" watchObservedRunningTime="2025-11-22 03:07:28.873432283 +0000 UTC m=+813.179449576" Nov 22 03:07:28 crc kubenswrapper[4952]: I1122 03:07:28.892102 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pbzxf" podStartSLOduration=2.474044912 podStartE2EDuration="4.89208159s" podCreationTimestamp="2025-11-22 03:07:24 +0000 UTC" firstStartedPulling="2025-11-22 03:07:25.813379005 +0000 UTC m=+810.119396268" lastFinishedPulling="2025-11-22 03:07:28.231415633 +0000 UTC m=+812.537432946" observedRunningTime="2025-11-22 03:07:28.890156968 +0000 UTC m=+813.196174241" watchObservedRunningTime="2025-11-22 03:07:28.89208159 +0000 UTC m=+813.198098863" Nov 22 03:07:31 crc 
kubenswrapper[4952]: I1122 03:07:31.023841 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"] Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.025941 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.033504 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"] Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.043561 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.043679 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.043731 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvm2\" (UniqueName: \"kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.145889 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.146067 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.146118 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvm2\" (UniqueName: \"kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.146720 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.146763 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities\") pod 
\"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.174896 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvm2\" (UniqueName: \"kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2\") pod \"redhat-operators-nvvk8\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.364139 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.650422 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"] Nov 22 03:07:31 crc kubenswrapper[4952]: W1122 03:07:31.658756 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e09635f_3708_40c4_bae8_3012775e4997.slice/crio-dd97a115fe23f677f48ae8d5daab3a64d7241c9bd3a643b7dc7f0c76305f4632 WatchSource:0}: Error finding container dd97a115fe23f677f48ae8d5daab3a64d7241c9bd3a643b7dc7f0c76305f4632: Status 404 returned error can't find the container with id dd97a115fe23f677f48ae8d5daab3a64d7241c9bd3a643b7dc7f0c76305f4632 Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.879632 4952 generic.go:334] "Generic (PLEG): container finished" podID="6e09635f-3708-40c4-bae8-3012775e4997" containerID="521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f" exitCode=0 Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.879699 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerDied","Data":"521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f"} Nov 22 03:07:31 crc kubenswrapper[4952]: I1122 03:07:31.880247 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerStarted","Data":"dd97a115fe23f677f48ae8d5daab3a64d7241c9bd3a643b7dc7f0c76305f4632"} Nov 22 03:07:32 crc kubenswrapper[4952]: I1122 03:07:32.888890 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerStarted","Data":"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"} Nov 22 03:07:33 crc kubenswrapper[4952]: I1122 03:07:33.902171 4952 generic.go:334] "Generic (PLEG): container finished" podID="6e09635f-3708-40c4-bae8-3012775e4997" containerID="558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851" exitCode=0 Nov 22 03:07:33 crc kubenswrapper[4952]: I1122 03:07:33.902258 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerDied","Data":"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"} Nov 22 03:07:34 crc kubenswrapper[4952]: I1122 03:07:34.953732 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:34 crc kubenswrapper[4952]: I1122 03:07:34.955179 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.025743 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.159165 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.159260 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.221697 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.967991 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:35 crc kubenswrapper[4952]: I1122 03:07:35.985155 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:36 crc kubenswrapper[4952]: I1122 03:07:36.930367 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerStarted","Data":"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b"} Nov 22 03:07:36 crc kubenswrapper[4952]: I1122 03:07:36.953830 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nvvk8" podStartSLOduration=2.260591888 podStartE2EDuration="6.953807002s" podCreationTimestamp="2025-11-22 03:07:30 +0000 UTC" firstStartedPulling="2025-11-22 03:07:31.881780575 +0000 UTC m=+816.187797868" lastFinishedPulling="2025-11-22 03:07:36.574995689 +0000 UTC m=+820.881012982" observedRunningTime="2025-11-22 03:07:36.953681919 +0000 UTC m=+821.259699212" watchObservedRunningTime="2025-11-22 03:07:36.953807002 +0000 UTC m=+821.259824285" Nov 22 03:07:37 crc kubenswrapper[4952]: I1122 03:07:37.011457 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"] Nov 22 03:07:37 crc kubenswrapper[4952]: I1122 03:07:37.611597 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"] Nov 22 03:07:37 crc kubenswrapper[4952]: I1122 03:07:37.946967 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pbzxf" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server" containerID="cri-o://33a8dfd5ec2ae59a45aaf811edf60742bf76a9545c19fd462c5a24f901a45d87" gracePeriod=2 Nov 22 03:07:38 crc kubenswrapper[4952]: I1122 03:07:38.955517 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbw6s" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="registry-server" containerID="cri-o://9357d48d6f2d26e5d6c7c387a102ab01098d8cd2913910d6fb107fe7a0d568e6" gracePeriod=2 Nov 22 03:07:40 crc kubenswrapper[4952]: I1122 03:07:40.997143 4952 generic.go:334] "Generic (PLEG): container finished" podID="c1b715d4-384c-40b3-a393-b880ae580966" containerID="33a8dfd5ec2ae59a45aaf811edf60742bf76a9545c19fd462c5a24f901a45d87" exitCode=0 Nov 22 03:07:40 crc kubenswrapper[4952]: I1122 03:07:40.997287 4952 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerDied","Data":"33a8dfd5ec2ae59a45aaf811edf60742bf76a9545c19fd462c5a24f901a45d87"} Nov 22 03:07:41 crc kubenswrapper[4952]: I1122 03:07:41.365222 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:41 crc kubenswrapper[4952]: I1122 03:07:41.365316 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.437567 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nvvk8" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server" probeResult="failure" output=< Nov 22 03:07:42 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s Nov 22 03:07:42 crc kubenswrapper[4952]: > Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.473923 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.631104 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content\") pod \"c1b715d4-384c-40b3-a393-b880ae580966\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.631242 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities\") pod \"c1b715d4-384c-40b3-a393-b880ae580966\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.631286 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k5kg\" (UniqueName: \"kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg\") pod \"c1b715d4-384c-40b3-a393-b880ae580966\" (UID: \"c1b715d4-384c-40b3-a393-b880ae580966\") " Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.632354 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities" (OuterVolumeSpecName: "utilities") pod "c1b715d4-384c-40b3-a393-b880ae580966" (UID: "c1b715d4-384c-40b3-a393-b880ae580966"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.638740 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg" (OuterVolumeSpecName: "kube-api-access-7k5kg") pod "c1b715d4-384c-40b3-a393-b880ae580966" (UID: "c1b715d4-384c-40b3-a393-b880ae580966"). InnerVolumeSpecName "kube-api-access-7k5kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.651625 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1b715d4-384c-40b3-a393-b880ae580966" (UID: "c1b715d4-384c-40b3-a393-b880ae580966"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.733183 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.734327 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b715d4-384c-40b3-a393-b880ae580966-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:42 crc kubenswrapper[4952]: I1122 03:07:42.734349 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k5kg\" (UniqueName: \"kubernetes.io/projected/c1b715d4-384c-40b3-a393-b880ae580966-kube-api-access-7k5kg\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.021830 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbzxf" event={"ID":"c1b715d4-384c-40b3-a393-b880ae580966","Type":"ContainerDied","Data":"63aa574447f9cd27a035a6218c7451076b98e4c88130657ed4b87bac84434623"} Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.022372 4952 scope.go:117] "RemoveContainer" containerID="33a8dfd5ec2ae59a45aaf811edf60742bf76a9545c19fd462c5a24f901a45d87" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.021881 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbzxf" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.028290 4952 generic.go:334] "Generic (PLEG): container finished" podID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerID="9357d48d6f2d26e5d6c7c387a102ab01098d8cd2913910d6fb107fe7a0d568e6" exitCode=0 Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.028338 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerDied","Data":"9357d48d6f2d26e5d6c7c387a102ab01098d8cd2913910d6fb107fe7a0d568e6"} Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.061278 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"] Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.061808 4952 scope.go:117] "RemoveContainer" containerID="c895ba1bd6b2259bbeb492d5c353b2711e969e64d4ae100ea991b8a91ad0744e" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.067942 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbzxf"] Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.124155 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.128228 4952 scope.go:117] "RemoveContainer" containerID="de71cf01e1180f2c7e410e40518ffd555fedd7c0615418ceb45e9b6b27c696d5" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.240051 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content\") pod \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.240118 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjd4w\" (UniqueName: \"kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w\") pod \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.240346 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities\") pod \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\" (UID: \"cae57b38-9d8a-4c42-93c6-4811d0ffe144\") " Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.241426 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities" (OuterVolumeSpecName: "utilities") pod "cae57b38-9d8a-4c42-93c6-4811d0ffe144" (UID: "cae57b38-9d8a-4c42-93c6-4811d0ffe144"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.243834 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w" (OuterVolumeSpecName: "kube-api-access-sjd4w") pod "cae57b38-9d8a-4c42-93c6-4811d0ffe144" (UID: "cae57b38-9d8a-4c42-93c6-4811d0ffe144"). InnerVolumeSpecName "kube-api-access-sjd4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.278878 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cae57b38-9d8a-4c42-93c6-4811d0ffe144" (UID: "cae57b38-9d8a-4c42-93c6-4811d0ffe144"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.341817 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.342133 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae57b38-9d8a-4c42-93c6-4811d0ffe144-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:43 crc kubenswrapper[4952]: I1122 03:07:43.342217 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjd4w\" (UniqueName: \"kubernetes.io/projected/cae57b38-9d8a-4c42-93c6-4811d0ffe144-kube-api-access-sjd4w\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.045949 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbw6s" event={"ID":"cae57b38-9d8a-4c42-93c6-4811d0ffe144","Type":"ContainerDied","Data":"ab20e65fc5ab2f03e52f5803f66dfff37eca67aa4e34158750d9b5b0cac78134"} Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.046749 4952 scope.go:117] "RemoveContainer" containerID="9357d48d6f2d26e5d6c7c387a102ab01098d8cd2913910d6fb107fe7a0d568e6" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.046116 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbw6s" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.080019 4952 scope.go:117] "RemoveContainer" containerID="68c175b5dd1047e4fa256655a03cf31b8238929a3c2fa7ac1d8c80e511e2eca8" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.101213 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"] Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.109861 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbw6s"] Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.120256 4952 scope.go:117] "RemoveContainer" containerID="d3d3862a1f95a0519ffbc84e51265036319dea4e77c041285903fcf98750460d" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.555602 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b715d4-384c-40b3-a393-b880ae580966" path="/var/lib/kubelet/pods/c1b715d4-384c-40b3-a393-b880ae580966/volumes" Nov 22 03:07:44 crc kubenswrapper[4952]: I1122 03:07:44.557684 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" path="/var/lib/kubelet/pods/cae57b38-9d8a-4c42-93c6-4811d0ffe144/volumes" Nov 22 03:07:51 crc kubenswrapper[4952]: I1122 03:07:51.443346 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:51 crc kubenswrapper[4952]: I1122 03:07:51.492075 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:51 crc kubenswrapper[4952]: I1122 03:07:51.699493 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"] Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.118992 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nvvk8" podUID="6e09635f-3708-40c4-bae8-3012775e4997" 
containerName="registry-server" containerID="cri-o://34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b" gracePeriod=2 Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.540645 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvvk8" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.702049 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities\") pod \"6e09635f-3708-40c4-bae8-3012775e4997\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.702144 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvm2\" (UniqueName: \"kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2\") pod \"6e09635f-3708-40c4-bae8-3012775e4997\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.702170 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content\") pod \"6e09635f-3708-40c4-bae8-3012775e4997\" (UID: \"6e09635f-3708-40c4-bae8-3012775e4997\") " Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.702912 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities" (OuterVolumeSpecName: "utilities") pod "6e09635f-3708-40c4-bae8-3012775e4997" (UID: "6e09635f-3708-40c4-bae8-3012775e4997"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.710465 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2" (OuterVolumeSpecName: "kube-api-access-mfvm2") pod "6e09635f-3708-40c4-bae8-3012775e4997" (UID: "6e09635f-3708-40c4-bae8-3012775e4997"). InnerVolumeSpecName "kube-api-access-mfvm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.800107 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e09635f-3708-40c4-bae8-3012775e4997" (UID: "6e09635f-3708-40c4-bae8-3012775e4997"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.803815 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.803851 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvm2\" (UniqueName: \"kubernetes.io/projected/6e09635f-3708-40c4-bae8-3012775e4997-kube-api-access-mfvm2\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:53 crc kubenswrapper[4952]: I1122 03:07:53.803868 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e09635f-3708-40c4-bae8-3012775e4997-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.127439 4952 generic.go:334] "Generic (PLEG): container finished" podID="6e09635f-3708-40c4-bae8-3012775e4997" containerID="34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b" exitCode=0 Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.127491 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerDied","Data":"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b"} Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.127524 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvvk8" event={"ID":"6e09635f-3708-40c4-bae8-3012775e4997","Type":"ContainerDied","Data":"dd97a115fe23f677f48ae8d5daab3a64d7241c9bd3a643b7dc7f0c76305f4632"} Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.127562 4952 scope.go:117] "RemoveContainer" containerID="34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b" Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.127685 4952 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.150452 4952 scope.go:117] "RemoveContainer" containerID="558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.157752 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"]
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.167124 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nvvk8"]
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.182465 4952 scope.go:117] "RemoveContainer" containerID="521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.198171 4952 scope.go:117] "RemoveContainer" containerID="34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b"
Nov 22 03:07:54 crc kubenswrapper[4952]: E1122 03:07:54.198767 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b\": container with ID starting with 34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b not found: ID does not exist" containerID="34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.198820 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b"} err="failed to get container status \"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b\": rpc error: code = NotFound desc = could not find container \"34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b\": container with ID starting with 34cc435d42c5c4c487fe2107c8df8a0fbe6f121005c08ab665738b709fa5b39b not found: ID does not exist"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.198856 4952 scope.go:117] "RemoveContainer" containerID="558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"
Nov 22 03:07:54 crc kubenswrapper[4952]: E1122 03:07:54.199286 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851\": container with ID starting with 558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851 not found: ID does not exist" containerID="558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.199351 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851"} err="failed to get container status \"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851\": rpc error: code = NotFound desc = could not find container \"558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851\": container with ID starting with 558e5459fbc61e65dde7f7b52cb2a29448473ec5170bf3e80a2f9cc4a1364851 not found: ID does not exist"
Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.199399 4952 scope.go:117] "RemoveContainer" containerID="521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f"
err="rpc error: code = NotFound desc = could not find container \"521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f\": container with ID starting with 521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f not found: ID does not exist" containerID="521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f" Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.199912 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f"} err="failed to get container status \"521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f\": rpc error: code = NotFound desc = could not find container \"521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f\": container with ID starting with 521b721d3a8d70b69093817d5b72c8d6149991c8d1a885ecb6085b5cf5813d8f not found: ID does not exist" Nov 22 03:07:54 crc kubenswrapper[4952]: I1122 03:07:54.540528 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e09635f-3708-40c4-bae8-3012775e4997" path="/var/lib/kubelet/pods/6e09635f-3708-40c4-bae8-3012775e4997/volumes" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.662364 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"] Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663748 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-content" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663768 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-content" Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663782 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-content" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663792 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-content" Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663807 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663817 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server" Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663835 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663844 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server" Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663860 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-utilities" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663871 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-utilities" Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663893 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09635f-3708-40c4-bae8-3012775e4997" 
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.662364 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"]
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663748 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663768 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663782 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663792 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663807 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663817 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663835 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663844 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663860 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663871 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663893 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663905 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="extract-content"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663920 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663929 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663939 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663947 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.663960 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.663968 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="extract-utilities"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.664115 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b715d4-384c-40b3-a393-b880ae580966" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.664130 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae57b38-9d8a-4c42-93c6-4811d0ffe144" containerName="registry-server"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.664151 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e09635f-3708-40c4-bae8-3012775e4997" containerName="registry-server"
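
The RemoveStaleState and "Deleted CPUSet assignment" entries above are the CPU and memory managers pruning checkpointed state for containers whose pods no longer exist (the catalog pods torn down earlier). That checkpoint lives in a JSON file on the node; a sketch for inspecting it, assuming the default kubelet state path and noting that per-container entries only appear when the static CPU manager policy is configured:

    import json

    # Default kubelet checkpoint location; exact keys depend on the policy.
    with open("/var/lib/kubelet/cpu_manager_state") as f:
        state = json.load(f)

    print(state.get("policyName"))     # e.g. "none" or "static"
    print(state.get("defaultCpuSet"))  # shared pool under the static policy
    # Per-container CPU assignments (static policy only):
    for pod_uid, containers in state.get("entries", {}).items():
        for name, cpus in containers.items():
            print(pod_uid, name, cpus)
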
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.665059 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.667458 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.668035 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-r9z22"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.671865 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.674132 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-drtp8"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.678045 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.691686 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.692945 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.698653 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.698868 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-n66zk"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.733205 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.738233 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prb2g\" (UniqueName: \"kubernetes.io/projected/3166414a-5d0f-460b-81cd-a8cfab489ff5-kube-api-access-prb2g\") pod \"barbican-operator-controller-manager-75fb479bcc-j9bj9\" (UID: \"3166414a-5d0f-460b-81cd-a8cfab489ff5\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.748779 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.750189 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.753024 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mxqr5"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.757476 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl"]
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.759104 4952 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.761367 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.762515 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vhp2p" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.768000 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.769431 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.788687 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-5vx6l" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.801293 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.802659 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.804798 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.805063 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nbjsl" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.836895 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.839942 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggn42\" (UniqueName: \"kubernetes.io/projected/2704d952-0ecf-42b9-9185-625d8c662a00-kube-api-access-ggn42\") pod \"cinder-operator-controller-manager-fdcbbd9b5-w9lwq\" (UID: \"2704d952-0ecf-42b9-9185-625d8c662a00\") " pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.840060 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prb2g\" (UniqueName: \"kubernetes.io/projected/3166414a-5d0f-460b-81cd-a8cfab489ff5-kube-api-access-prb2g\") pod \"barbican-operator-controller-manager-75fb479bcc-j9bj9\" (UID: \"3166414a-5d0f-460b-81cd-a8cfab489ff5\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.840247 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznsn\" (UniqueName: \"kubernetes.io/projected/dbb212fa-2d69-47c9-8ddc-3f1d78ce745a-kube-api-access-tznsn\") pod \"designate-operator-controller-manager-767ccfd65f-t2mlk\" (UID: \"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" Nov 22 
03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.840284 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.857640 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.876197 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.877443 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.881051 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-znzjl" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.894928 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prb2g\" (UniqueName: \"kubernetes.io/projected/3166414a-5d0f-460b-81cd-a8cfab489ff5-kube-api-access-prb2g\") pod \"barbican-operator-controller-manager-75fb479bcc-j9bj9\" (UID: \"3166414a-5d0f-460b-81cd-a8cfab489ff5\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.914664 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.937012 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn"] Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.953360 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxq45\" (UniqueName: \"kubernetes.io/projected/0b8276cf-2a3d-40ad-83c6-fa522270b8a7-kube-api-access-jxq45\") pod \"heat-operator-controller-manager-56f54d6746-29fxl\" (UID: \"0b8276cf-2a3d-40ad-83c6-fa522270b8a7\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.955242 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f2br\" (UniqueName: \"kubernetes.io/projected/36ba8aa7-85ec-461d-a0d7-39f09c60289f-kube-api-access-8f2br\") pod \"horizon-operator-controller-manager-598f69df5d-str4f\" (UID: \"36ba8aa7-85ec-461d-a0d7-39f09c60289f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.955628 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvjp\" (UniqueName: \"kubernetes.io/projected/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-kube-api-access-mlvjp\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:02 crc 
kubenswrapper[4952]: I1122 03:08:02.962630 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznsn\" (UniqueName: \"kubernetes.io/projected/dbb212fa-2d69-47c9-8ddc-3f1d78ce745a-kube-api-access-tznsn\") pod \"designate-operator-controller-manager-767ccfd65f-t2mlk\" (UID: \"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.962705 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.962770 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggn42\" (UniqueName: \"kubernetes.io/projected/2704d952-0ecf-42b9-9185-625d8c662a00-kube-api-access-ggn42\") pod \"cinder-operator-controller-manager-fdcbbd9b5-w9lwq\" (UID: \"2704d952-0ecf-42b9-9185-625d8c662a00\") " pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.962841 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4bpg\" (UniqueName: \"kubernetes.io/projected/996261d7-26a3-41f8-9531-73c3ec296c1d-kube-api-access-f4bpg\") pod \"glance-operator-controller-manager-7969689c84-lhbkq\" (UID: \"996261d7-26a3-41f8-9531-73c3ec296c1d\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq"
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.963616 4952 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 22 03:08:02 crc kubenswrapper[4952]: E1122 03:08:02.963689 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert podName:e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b nodeName:}" failed. No retries permitted until 2025-11-22 03:08:03.463670014 +0000 UTC m=+847.769687287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert") pod "infra-operator-controller-manager-7875d8bb94-2vq4f" (UID: "e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b") : secret "infra-operator-webhook-server-cert" not found
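
The entries above trace the kubelet volume reconciler: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded for each projected service-account token, while the infra-operator "cert" volume fails because its webhook-server secret does not exist yet (presumably whatever issues these operator webhook certs has not run at this point, so the kubelet keeps retrying). A quick API-side check of that blocking condition, as a sketch with the `kubernetes` Python client; in a healthy install the secret is created for the operator rather than by hand:

    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    config.load_kube_config()
    v1 = client.CoreV1Api()

    # Both webhook cert secrets referenced in this section of the log.
    for name in ("infra-operator-webhook-server-cert",
                 "openstack-baremetal-operator-webhook-server-cert"):
        try:
            s = v1.read_namespaced_secret(name, "openstack-operators")
            print(name, "present, keys:", sorted(s.data or {}))
        except ApiException as e:
            if e.status == 404:
                print(name, "missing: the cert volume mount will keep retrying")
            else:
                raise
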
Nov 22 03:08:02 crc kubenswrapper[4952]: I1122 03:08:02.999088 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.011817 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggn42\" (UniqueName: \"kubernetes.io/projected/2704d952-0ecf-42b9-9185-625d8c662a00-kube-api-access-ggn42\") pod \"cinder-operator-controller-manager-fdcbbd9b5-w9lwq\" (UID: \"2704d952-0ecf-42b9-9185-625d8c662a00\") " pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.013207 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db"]
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.022394 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.022727 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.023996 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznsn\" (UniqueName: \"kubernetes.io/projected/dbb212fa-2d69-47c9-8ddc-3f1d78ce745a-kube-api-access-tznsn\") pod \"designate-operator-controller-manager-767ccfd65f-t2mlk\" (UID: \"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.027106 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.027513 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tnzdb"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.060094 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db"]
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.067634 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxq45\" (UniqueName: \"kubernetes.io/projected/0b8276cf-2a3d-40ad-83c6-fa522270b8a7-kube-api-access-jxq45\") pod \"heat-operator-controller-manager-56f54d6746-29fxl\" (UID: \"0b8276cf-2a3d-40ad-83c6-fa522270b8a7\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.067681 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f2br\" (UniqueName: \"kubernetes.io/projected/36ba8aa7-85ec-461d-a0d7-39f09c60289f-kube-api-access-8f2br\") pod \"horizon-operator-controller-manager-598f69df5d-str4f\" (UID: \"36ba8aa7-85ec-461d-a0d7-39f09c60289f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.067710 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l9dg\" (UniqueName: \"kubernetes.io/projected/3de98aa0-3a15-4c46-adf3-d1715ccf5274-kube-api-access-6l9dg\") pod \"ironic-operator-controller-manager-99b499f4-g7gzn\" (UID:
\"3de98aa0-3a15-4c46-adf3-d1715ccf5274\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.067760 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvjp\" (UniqueName: \"kubernetes.io/projected/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-kube-api-access-mlvjp\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.067828 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4bpg\" (UniqueName: \"kubernetes.io/projected/996261d7-26a3-41f8-9531-73c3ec296c1d-kube-api-access-f4bpg\") pod \"glance-operator-controller-manager-7969689c84-lhbkq\" (UID: \"996261d7-26a3-41f8-9531-73c3ec296c1d\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.080747 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.082288 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.093840 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.095301 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.104911 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8nvlr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.118589 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.119098 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvjp\" (UniqueName: \"kubernetes.io/projected/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-kube-api-access-mlvjp\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.119356 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-28pq5" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.124310 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxq45\" (UniqueName: \"kubernetes.io/projected/0b8276cf-2a3d-40ad-83c6-fa522270b8a7-kube-api-access-jxq45\") pod \"heat-operator-controller-manager-56f54d6746-29fxl\" (UID: \"0b8276cf-2a3d-40ad-83c6-fa522270b8a7\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.125970 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f2br\" (UniqueName: 
\"kubernetes.io/projected/36ba8aa7-85ec-461d-a0d7-39f09c60289f-kube-api-access-8f2br\") pod \"horizon-operator-controller-manager-598f69df5d-str4f\" (UID: \"36ba8aa7-85ec-461d-a0d7-39f09c60289f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.128531 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.129821 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4bpg\" (UniqueName: \"kubernetes.io/projected/996261d7-26a3-41f8-9531-73c3ec296c1d-kube-api-access-f4bpg\") pod \"glance-operator-controller-manager-7969689c84-lhbkq\" (UID: \"996261d7-26a3-41f8-9531-73c3ec296c1d\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.134828 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.135024 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.138720 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.140706 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qvfnr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.173766 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.178511 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l9dg\" (UniqueName: \"kubernetes.io/projected/3de98aa0-3a15-4c46-adf3-d1715ccf5274-kube-api-access-6l9dg\") pod \"ironic-operator-controller-manager-99b499f4-g7gzn\" (UID: \"3de98aa0-3a15-4c46-adf3-d1715ccf5274\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.178798 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fg7\" (UniqueName: \"kubernetes.io/projected/e8259b74-ce3d-4875-ac32-71e4397b4c01-kube-api-access-t5fg7\") pod \"keystone-operator-controller-manager-7454b96578-8d2db\" (UID: \"e8259b74-ce3d-4875-ac32-71e4397b4c01\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.180360 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.180942 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.181255 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.186567 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.187425 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bdlns" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.190798 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-nq2r6" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.198378 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l9dg\" (UniqueName: \"kubernetes.io/projected/3de98aa0-3a15-4c46-adf3-d1715ccf5274-kube-api-access-6l9dg\") pod \"ironic-operator-controller-manager-99b499f4-g7gzn\" (UID: \"3de98aa0-3a15-4c46-adf3-d1715ccf5274\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.198523 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.210431 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.218015 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.219316 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.223055 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.223283 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2zpxn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.224080 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.225753 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.229173 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-986dr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.231120 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.241743 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.262985 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.266517 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.276419 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.285712 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fg7\" (UniqueName: \"kubernetes.io/projected/e8259b74-ce3d-4875-ac32-71e4397b4c01-kube-api-access-t5fg7\") pod \"keystone-operator-controller-manager-7454b96578-8d2db\" (UID: \"e8259b74-ce3d-4875-ac32-71e4397b4c01\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.285832 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvqb\" (UniqueName: \"kubernetes.io/projected/887a5b13-80a9-4d9c-8b14-6768805ca936-kube-api-access-xgvqb\") pod \"neutron-operator-controller-manager-78bd47f458-qw49c\" (UID: \"887a5b13-80a9-4d9c-8b14-6768805ca936\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.285869 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xt4\" (UniqueName: \"kubernetes.io/projected/032f6fc1-7bde-422b-bbe0-d83027f069d0-kube-api-access-x8xt4\") pod \"mariadb-operator-controller-manager-54b5986bb8-gr8vb\" (UID: \"032f6fc1-7bde-422b-bbe0-d83027f069d0\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.285926 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc9dl\" (UniqueName: \"kubernetes.io/projected/6523048e-98da-4e64-9c79-7bbeda6ea361-kube-api-access-lc9dl\") pod \"manila-operator-controller-manager-58f887965d-dkhsn\" (UID: \"6523048e-98da-4e64-9c79-7bbeda6ea361\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.285983 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl85j\" (UniqueName: \"kubernetes.io/projected/c866a446-9629-4d59-8953-0599bda45549-kube-api-access-bl85j\") pod 
\"nova-operator-controller-manager-cfbb9c588-ztqdk\" (UID: \"c866a446-9629-4d59-8953-0599bda45549\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.286086 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.289143 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.290624 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nf5h5" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.298469 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-55mb6" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.301788 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.302733 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.326491 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9vztr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.336374 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.380604 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.381829 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393057 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhcf\" (UniqueName: \"kubernetes.io/projected/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-kube-api-access-wmhcf\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393112 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl85j\" (UniqueName: \"kubernetes.io/projected/c866a446-9629-4d59-8953-0599bda45549-kube-api-access-bl85j\") pod \"nova-operator-controller-manager-cfbb9c588-ztqdk\" (UID: \"c866a446-9629-4d59-8953-0599bda45549\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393147 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc79r\" (UniqueName: \"kubernetes.io/projected/d44a318e-4d58-4719-a567-6d849321b946-kube-api-access-kc79r\") pod \"octavia-operator-controller-manager-54cfbf4c7d-68bjm\" (UID: \"d44a318e-4d58-4719-a567-6d849321b946\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393181 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5lwf\" (UniqueName: \"kubernetes.io/projected/e71d0c55-c142-4a8d-8677-accf9858de48-kube-api-access-j5lwf\") pod \"ovn-operator-controller-manager-54fc5f65b7-hfx5m\" (UID: \"e71d0c55-c142-4a8d-8677-accf9858de48\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393199 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393248 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvqb\" (UniqueName: \"kubernetes.io/projected/887a5b13-80a9-4d9c-8b14-6768805ca936-kube-api-access-xgvqb\") pod \"neutron-operator-controller-manager-78bd47f458-qw49c\" (UID: \"887a5b13-80a9-4d9c-8b14-6768805ca936\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393272 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8xt4\" (UniqueName: \"kubernetes.io/projected/032f6fc1-7bde-422b-bbe0-d83027f069d0-kube-api-access-x8xt4\") pod \"mariadb-operator-controller-manager-54b5986bb8-gr8vb\" (UID: \"032f6fc1-7bde-422b-bbe0-d83027f069d0\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393306 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hlvxr\" (UniqueName: \"kubernetes.io/projected/078fa30e-9d17-4ca9-911a-43a0376ffe8f-kube-api-access-hlvxr\") pod \"placement-operator-controller-manager-5b797b8dff-wnb2k\" (UID: \"078fa30e-9d17-4ca9-911a-43a0376ffe8f\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.393330 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc9dl\" (UniqueName: \"kubernetes.io/projected/6523048e-98da-4e64-9c79-7bbeda6ea361-kube-api-access-lc9dl\") pod \"manila-operator-controller-manager-58f887965d-dkhsn\" (UID: \"6523048e-98da-4e64-9c79-7bbeda6ea361\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.395930 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fg7\" (UniqueName: \"kubernetes.io/projected/e8259b74-ce3d-4875-ac32-71e4397b4c01-kube-api-access-t5fg7\") pod \"keystone-operator-controller-manager-7454b96578-8d2db\" (UID: \"e8259b74-ce3d-4875-ac32-71e4397b4c01\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.416320 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.427694 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.432220 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl85j\" (UniqueName: \"kubernetes.io/projected/c866a446-9629-4d59-8953-0599bda45549-kube-api-access-bl85j\") pod \"nova-operator-controller-manager-cfbb9c588-ztqdk\" (UID: \"c866a446-9629-4d59-8953-0599bda45549\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.455367 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc9dl\" (UniqueName: \"kubernetes.io/projected/6523048e-98da-4e64-9c79-7bbeda6ea361-kube-api-access-lc9dl\") pod \"manila-operator-controller-manager-58f887965d-dkhsn\" (UID: \"6523048e-98da-4e64-9c79-7bbeda6ea361\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.458132 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvqb\" (UniqueName: \"kubernetes.io/projected/887a5b13-80a9-4d9c-8b14-6768805ca936-kube-api-access-xgvqb\") pod \"neutron-operator-controller-manager-78bd47f458-qw49c\" (UID: \"887a5b13-80a9-4d9c-8b14-6768805ca936\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.465142 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.473782 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-hb768"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.474895 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.479062 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bjl8v" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.479084 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.484821 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-hb768"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.485026 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xt4\" (UniqueName: \"kubernetes.io/projected/032f6fc1-7bde-422b-bbe0-d83027f069d0-kube-api-access-x8xt4\") pod \"mariadb-operator-controller-manager-54b5986bb8-gr8vb\" (UID: \"032f6fc1-7bde-422b-bbe0-d83027f069d0\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496195 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc79r\" (UniqueName: \"kubernetes.io/projected/d44a318e-4d58-4719-a567-6d849321b946-kube-api-access-kc79r\") pod \"octavia-operator-controller-manager-54cfbf4c7d-68bjm\" (UID: \"d44a318e-4d58-4719-a567-6d849321b946\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496265 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5lwf\" (UniqueName: \"kubernetes.io/projected/e71d0c55-c142-4a8d-8677-accf9858de48-kube-api-access-j5lwf\") pod \"ovn-operator-controller-manager-54fc5f65b7-hfx5m\" (UID: \"e71d0c55-c142-4a8d-8677-accf9858de48\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496293 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496321 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496380 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/5cbee4eb-c188-4a47-9d37-4de16bd79f07-kube-api-access-x9vbk\") pod \"swift-operator-controller-manager-d656998f4-d9k2w\" (UID: \"5cbee4eb-c188-4a47-9d37-4de16bd79f07\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496412 4952 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hlvxr\" (UniqueName: \"kubernetes.io/projected/078fa30e-9d17-4ca9-911a-43a0376ffe8f-kube-api-access-hlvxr\") pod \"placement-operator-controller-manager-5b797b8dff-wnb2k\" (UID: \"078fa30e-9d17-4ca9-911a-43a0376ffe8f\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496445 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lklbz\" (UniqueName: \"kubernetes.io/projected/87f47e69-902e-4a6a-a6d6-ce72f960a9e4-kube-api-access-lklbz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-vlsq4\" (UID: \"87f47e69-902e-4a6a-a6d6-ce72f960a9e4\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.496473 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhcf\" (UniqueName: \"kubernetes.io/projected/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-kube-api-access-wmhcf\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:03 crc kubenswrapper[4952]: E1122 03:08:03.502837 4952 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:08:03 crc kubenswrapper[4952]: E1122 03:08:03.502974 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert podName:d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5 nodeName:}" failed. No retries permitted until 2025-11-22 03:08:04.002947936 +0000 UTC m=+848.308965209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" (UID: "d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:08:03 crc kubenswrapper[4952]: E1122 03:08:03.503131 4952 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 22 03:08:03 crc kubenswrapper[4952]: E1122 03:08:03.503262 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert podName:e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b nodeName:}" failed. No retries permitted until 2025-11-22 03:08:04.503214163 +0000 UTC m=+848.809231436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert") pod "infra-operator-controller-manager-7875d8bb94-2vq4f" (UID: "e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b") : secret "infra-operator-webhook-server-cert" not found Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.505086 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.521362 4952 util.go:30] "No sandbox for pod can be found. 
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.505086 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.521362 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.576105 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc79r\" (UniqueName: \"kubernetes.io/projected/d44a318e-4d58-4719-a567-6d849321b946-kube-api-access-kc79r\") pod \"octavia-operator-controller-manager-54cfbf4c7d-68bjm\" (UID: \"d44a318e-4d58-4719-a567-6d849321b946\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.576985 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5lwf\" (UniqueName: \"kubernetes.io/projected/e71d0c55-c142-4a8d-8677-accf9858de48-kube-api-access-j5lwf\") pod \"ovn-operator-controller-manager-54fc5f65b7-hfx5m\" (UID: \"e71d0c55-c142-4a8d-8677-accf9858de48\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.599816 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/5cbee4eb-c188-4a47-9d37-4de16bd79f07-kube-api-access-x9vbk\") pod \"swift-operator-controller-manager-d656998f4-d9k2w\" (UID: \"5cbee4eb-c188-4a47-9d37-4de16bd79f07\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.599892 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lklbz\" (UniqueName: \"kubernetes.io/projected/87f47e69-902e-4a6a-a6d6-ce72f960a9e4-kube-api-access-lklbz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-vlsq4\" (UID: \"87f47e69-902e-4a6a-a6d6-ce72f960a9e4\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.625016 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvxr\" (UniqueName: \"kubernetes.io/projected/078fa30e-9d17-4ca9-911a-43a0376ffe8f-kube-api-access-hlvxr\") pod \"placement-operator-controller-manager-5b797b8dff-wnb2k\" (UID: \"078fa30e-9d17-4ca9-911a-43a0376ffe8f\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.626054 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmhcf\" (UniqueName: \"kubernetes.io/projected/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-kube-api-access-wmhcf\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9"
Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.633238 4952 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.637346 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lklbz\" (UniqueName: \"kubernetes.io/projected/87f47e69-902e-4a6a-a6d6-ce72f960a9e4-kube-api-access-lklbz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-vlsq4\" (UID: \"87f47e69-902e-4a6a-a6d6-ce72f960a9e4\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.732327 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vbk\" (UniqueName: \"kubernetes.io/projected/5cbee4eb-c188-4a47-9d37-4de16bd79f07-kube-api-access-x9vbk\") pod \"swift-operator-controller-manager-d656998f4-d9k2w\" (UID: \"5cbee4eb-c188-4a47-9d37-4de16bd79f07\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.736097 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.737592 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.738956 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.739007 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7286\" (UniqueName: \"kubernetes.io/projected/6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2-kube-api-access-z7286\") pod \"test-operator-controller-manager-b4c496f69-hb768\" (UID: \"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.747810 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-c8nck" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.754411 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.767710 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.789282 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.841811 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7286\" (UniqueName: \"kubernetes.io/projected/6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2-kube-api-access-z7286\") pod \"test-operator-controller-manager-b4c496f69-hb768\" (UID: \"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.841914 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbmj\" (UniqueName: \"kubernetes.io/projected/6791492d-51b4-4cd0-bbac-bd690baec76e-kube-api-access-sxbmj\") pod \"watcher-operator-controller-manager-8c6448b9f-ftxgw\" (UID: \"6791492d-51b4-4cd0-bbac-bd690baec76e\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.842411 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.855843 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.858648 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.866316 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r278n" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.866590 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.866918 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.904496 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.906171 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.915990 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7286\" (UniqueName: \"kubernetes.io/projected/6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2-kube-api-access-z7286\") pod \"test-operator-controller-manager-b4c496f69-hb768\" (UID: \"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.929387 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ngp2b" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.968866 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvtl\" (UniqueName: \"kubernetes.io/projected/ff89dfa5-e056-4753-99a0-4aad073a1734-kube-api-access-6mvtl\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.968942 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.969037 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dpcb\" (UniqueName: \"kubernetes.io/projected/343962a9-648e-4dd5-a813-730ef99a136e-kube-api-access-2dpcb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r47mr\" (UID: \"343962a9-648e-4dd5-a813-730ef99a136e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.969087 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbmj\" (UniqueName: \"kubernetes.io/projected/6791492d-51b4-4cd0-bbac-bd690baec76e-kube-api-access-sxbmj\") pod \"watcher-operator-controller-manager-8c6448b9f-ftxgw\" (UID: \"6791492d-51b4-4cd0-bbac-bd690baec76e\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.971108 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr"] Nov 22 03:08:03 crc kubenswrapper[4952]: I1122 03:08:03.978466 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq"] Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.024306 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.042233 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbmj\" (UniqueName: \"kubernetes.io/projected/6791492d-51b4-4cd0-bbac-bd690baec76e-kube-api-access-sxbmj\") pod \"watcher-operator-controller-manager-8c6448b9f-ftxgw\" (UID: \"6791492d-51b4-4cd0-bbac-bd690baec76e\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.070780 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvtl\" (UniqueName: \"kubernetes.io/projected/ff89dfa5-e056-4753-99a0-4aad073a1734-kube-api-access-6mvtl\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.070842 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.070883 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dpcb\" (UniqueName: \"kubernetes.io/projected/343962a9-648e-4dd5-a813-730ef99a136e-kube-api-access-2dpcb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r47mr\" (UID: \"343962a9-648e-4dd5-a813-730ef99a136e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.070935 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:04 crc kubenswrapper[4952]: E1122 03:08:04.071057 4952 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 22 03:08:04 crc kubenswrapper[4952]: E1122 03:08:04.071125 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert podName:ff89dfa5-e056-4753-99a0-4aad073a1734 nodeName:}" failed. No retries permitted until 2025-11-22 03:08:04.571104771 +0000 UTC m=+848.877122034 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert") pod "openstack-operator-controller-manager-66b68cc995-2h722" (UID: "ff89dfa5-e056-4753-99a0-4aad073a1734") : secret "webhook-server-cert" not found Nov 22 03:08:04 crc kubenswrapper[4952]: E1122 03:08:04.071057 4952 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:08:04 crc kubenswrapper[4952]: E1122 03:08:04.071273 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert podName:d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5 nodeName:}" failed. No retries permitted until 2025-11-22 03:08:05.071262995 +0000 UTC m=+849.377280268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" (UID: "d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.092483 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dpcb\" (UniqueName: \"kubernetes.io/projected/343962a9-648e-4dd5-a813-730ef99a136e-kube-api-access-2dpcb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r47mr\" (UID: \"343962a9-648e-4dd5-a813-730ef99a136e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.094581 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk"] Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.108519 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvtl\" (UniqueName: \"kubernetes.io/projected/ff89dfa5-e056-4753-99a0-4aad073a1734-kube-api-access-6mvtl\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.119406 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.156346 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.205268 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.261754 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" event={"ID":"2704d952-0ecf-42b9-9185-625d8c662a00","Type":"ContainerStarted","Data":"109244122b3b99d54b621be75b0470e20e59ae115fddcf2eebd7aeda4c768c98"} Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.303579 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9"] Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.580290 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.580389 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.589591 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b-cert\") pod \"infra-operator-controller-manager-7875d8bb94-2vq4f\" (UID: \"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.594269 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff89dfa5-e056-4753-99a0-4aad073a1734-cert\") pod \"openstack-operator-controller-manager-66b68cc995-2h722\" (UID: \"ff89dfa5-e056-4753-99a0-4aad073a1734\") " pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.605119 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.643713 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:04 crc kubenswrapper[4952]: I1122 03:08:04.766212 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.096696 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.102307 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9\" (UID: \"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.103214 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.245568 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.256893 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk"] Nov 22 03:08:05 crc kubenswrapper[4952]: W1122 03:08:05.270740 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71d0c55_c142_4a8d_8677_accf9858de48.slice/crio-43998f4be986ce97fa57c4229eaff0580cd9dc67dce2b9235a9de78f94f5cbec WatchSource:0}: Error finding container 43998f4be986ce97fa57c4229eaff0580cd9dc67dce2b9235a9de78f94f5cbec: Status 404 returned error can't find the container with id 43998f4be986ce97fa57c4229eaff0580cd9dc67dce2b9235a9de78f94f5cbec Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.272765 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" event={"ID":"3166414a-5d0f-460b-81cd-a8cfab489ff5","Type":"ContainerStarted","Data":"5f4283410944a5c9d4c4c19737a263e49e81deabeb0e7551075472b31be9ab71"} Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.299939 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" event={"ID":"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a","Type":"ContainerStarted","Data":"c9ab2071918943f525febb6eaff30b98fe3d4dd7b8f24db6df7d4018f8e6b022"} Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.322700 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" event={"ID":"36ba8aa7-85ec-461d-a0d7-39f09c60289f","Type":"ContainerStarted","Data":"6da6601bdd62c5eefbf8a64a2e54f679c478fef0cf2dc67ebdcdcd4adf436c09"} Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.415837 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.435351 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.457720 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.476889 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.488907 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.497111 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.502310 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl"] Nov 22 03:08:05 crc kubenswrapper[4952]: W1122 03:08:05.521481 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032f6fc1_7bde_422b_bbe0_d83027f069d0.slice/crio-11c265f9e99955671280beed33ce7186db68ace4436ed188cef2fb23708c2c40 WatchSource:0}: Error finding container 11c265f9e99955671280beed33ce7186db68ace4436ed188cef2fb23708c2c40: Status 404 returned error can't find the container with id 11c265f9e99955671280beed33ce7186db68ace4436ed188cef2fb23708c2c40 Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.538010 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4"] Nov 22 03:08:05 crc kubenswrapper[4952]: W1122 03:08:05.539247 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f47e69_902e_4a6a_a6d6_ce72f960a9e4.slice/crio-a70ba325d4076921242c80953bba6313e28372257d5c3987988139f3b2828866 WatchSource:0}: Error finding container a70ba325d4076921242c80953bba6313e28372257d5c3987988139f3b2828866: Status 404 returned error can't find the container with id a70ba325d4076921242c80953bba6313e28372257d5c3987988139f3b2828866 Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.559522 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.678786 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.702813 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.718137 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-hb768"] Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.718202 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dpcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-r47mr_openstack-operators(343962a9-648e-4dd5-a813-730ef99a136e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.720196 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" podUID="343962a9-648e-4dd5-a813-730ef99a136e" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.731105 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.768380 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f"] Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.784804 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxbmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-8c6448b9f-ftxgw_openstack-operators(6791492d-51b4-4cd0-bbac-bd690baec76e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.785206 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z7286,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-b4c496f69-hb768_openstack-operators(6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.787937 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:f0688f6a55b7b548aaafd5c2c4f0749a43e7ea447c62a24e8b35257c5d8ba17f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlvjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
infra-operator-controller-manager-7875d8bb94-2vq4f_openstack-operators(e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.792730 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm"] Nov 22 03:08:05 crc kubenswrapper[4952]: E1122 03:08:05.812070 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kc79r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-54cfbf4c7d-68bjm_openstack-operators(d44a318e-4d58-4719-a567-6d849321b946): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.836607 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722"] Nov 22 03:08:05 crc kubenswrapper[4952]: I1122 03:08:05.846426 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9"] Nov 22 03:08:05 crc kubenswrapper[4952]: W1122 03:08:05.949796 4952 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd324f1a8_46a5_4bb7_b84b_5f42a6a2c5d5.slice/crio-c2f6bc9a2f4cc4f40fd7b71c7016d8e0c7c69bcacbed90afd40499cf9317773e WatchSource:0}: Error finding container c2f6bc9a2f4cc4f40fd7b71c7016d8e0c7c69bcacbed90afd40499cf9317773e: Status 404 returned error can't find the container with id c2f6bc9a2f4cc4f40fd7b71c7016d8e0c7c69bcacbed90afd40499cf9317773e Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.333590 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" event={"ID":"996261d7-26a3-41f8-9531-73c3ec296c1d","Type":"ContainerStarted","Data":"acdb04a75098531e7d73a41acd6338234374d8d77d14028ffff451db6740cbaa"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.334805 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" event={"ID":"c866a446-9629-4d59-8953-0599bda45549","Type":"ContainerStarted","Data":"b91c665a0baba3478f45f18d3e7285210b83eb161fc45ee04f9c1322230f6707"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.338632 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" event={"ID":"887a5b13-80a9-4d9c-8b14-6768805ca936","Type":"ContainerStarted","Data":"a2d1ce5c5cb74e1c957b793711b67dfaa1ce566ed05d9eab9a6ef3f7b1f62c08"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.341021 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" event={"ID":"0b8276cf-2a3d-40ad-83c6-fa522270b8a7","Type":"ContainerStarted","Data":"ae4fcdcdd85db3ed7058d036e33a9ffa70436eac4e1a79d6d2e22abf119da736"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.342285 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" event={"ID":"6791492d-51b4-4cd0-bbac-bd690baec76e","Type":"ContainerStarted","Data":"3626ef84c31d8f72d3448abf37835e1331ebbded7f1933f4ab9a72aacaefb190"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.344581 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" event={"ID":"078fa30e-9d17-4ca9-911a-43a0376ffe8f","Type":"ContainerStarted","Data":"37e6f15fccaffe4433f9173ea5a1551b399f18d03115b10c9af230bf27bad388"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.346105 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" event={"ID":"032f6fc1-7bde-422b-bbe0-d83027f069d0","Type":"ContainerStarted","Data":"11c265f9e99955671280beed33ce7186db68ace4436ed188cef2fb23708c2c40"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.347583 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" event={"ID":"e8259b74-ce3d-4875-ac32-71e4397b4c01","Type":"ContainerStarted","Data":"11637b5b72d91b9cff3dcb3c279f965c7ecc24ba9bd84c17f32417635c66581c"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.348886 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" event={"ID":"e71d0c55-c142-4a8d-8677-accf9858de48","Type":"ContainerStarted","Data":"43998f4be986ce97fa57c4229eaff0580cd9dc67dce2b9235a9de78f94f5cbec"} Nov 22 
03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.350452 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" event={"ID":"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2","Type":"ContainerStarted","Data":"9c26d62c40e735be6757a04f9cf2d52ac3d7f45848bfd1289a996fd3a2f83ab4"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.352991 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" event={"ID":"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5","Type":"ContainerStarted","Data":"c2f6bc9a2f4cc4f40fd7b71c7016d8e0c7c69bcacbed90afd40499cf9317773e"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.354351 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" event={"ID":"87f47e69-902e-4a6a-a6d6-ce72f960a9e4","Type":"ContainerStarted","Data":"a70ba325d4076921242c80953bba6313e28372257d5c3987988139f3b2828866"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.355967 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" event={"ID":"6523048e-98da-4e64-9c79-7bbeda6ea361","Type":"ContainerStarted","Data":"71c3c896fc6d88858235555b81e2c6d534961aa53c95f2f7836b48a5b493d42e"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.358679 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" event={"ID":"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b","Type":"ContainerStarted","Data":"0d7675e3b2ecb53e229c55c98638dc5c85ad3967c3285c72abed1466dfee6f6d"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.360614 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" event={"ID":"3de98aa0-3a15-4c46-adf3-d1715ccf5274","Type":"ContainerStarted","Data":"5a3d0329db321d860c28e29fd2b9b2a6296c3b72d09076e31eab01c6c6ce0732"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.361757 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" event={"ID":"d44a318e-4d58-4719-a567-6d849321b946","Type":"ContainerStarted","Data":"b51e3851ddec30572aa6faacbb88c90b56e8e8e31e1b0fb0d328da586dafc716"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.364865 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" event={"ID":"ff89dfa5-e056-4753-99a0-4aad073a1734","Type":"ContainerStarted","Data":"8871f7aadd68c8510a68e1c4384155de565a319e806b9ffd0f954c640848bae3"} Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.366156 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" event={"ID":"343962a9-648e-4dd5-a813-730ef99a136e","Type":"ContainerStarted","Data":"587203ef145b66269a49c24281fc02f7c3c2ddd8e2aa4b5252c0d572d4174ec7"} Nov 22 03:08:06 crc kubenswrapper[4952]: E1122 03:08:06.367621 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" podUID="343962a9-648e-4dd5-a813-730ef99a136e" Nov 22 03:08:06 crc kubenswrapper[4952]: I1122 03:08:06.369829 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" event={"ID":"5cbee4eb-c188-4a47-9d37-4de16bd79f07","Type":"ContainerStarted","Data":"f402b8fbcf5ad1408c32e42e40ad61b724723cf038fdc81b8b70f37afc73e8e8"} Nov 22 03:08:07 crc kubenswrapper[4952]: E1122 03:08:07.381397 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" podUID="343962a9-648e-4dd5-a813-730ef99a136e" Nov 22 03:08:08 crc kubenswrapper[4952]: E1122 03:08:08.628159 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" podUID="6791492d-51b4-4cd0-bbac-bd690baec76e" Nov 22 03:08:08 crc kubenswrapper[4952]: E1122 03:08:08.631343 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" podUID="6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2" Nov 22 03:08:09 crc kubenswrapper[4952]: I1122 03:08:09.401604 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" event={"ID":"6791492d-51b4-4cd0-bbac-bd690baec76e","Type":"ContainerStarted","Data":"823a45fd35ac544ab700ed39ac151dfc5a22e91efb93e73408b71931adccc796"} Nov 22 03:08:09 crc kubenswrapper[4952]: E1122 03:08:09.407314 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" podUID="6791492d-51b4-4cd0-bbac-bd690baec76e" Nov 22 03:08:09 crc kubenswrapper[4952]: I1122 03:08:09.412313 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" event={"ID":"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2","Type":"ContainerStarted","Data":"661463b6ca8060984a72b91cb346e6633884d94d619deab523024ffbbacd5268"} Nov 22 03:08:09 crc kubenswrapper[4952]: E1122 03:08:09.420301 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" podUID="6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2" Nov 22 03:08:10 crc kubenswrapper[4952]: E1122 03:08:10.424661 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" podUID="6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2" Nov 22 03:08:10 crc kubenswrapper[4952]: E1122 03:08:10.425734 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" podUID="6791492d-51b4-4cd0-bbac-bd690baec76e" Nov 22 03:08:18 crc kubenswrapper[4952]: E1122 03:08:18.484990 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" podUID="e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b" Nov 22 03:08:18 crc kubenswrapper[4952]: I1122 03:08:18.493068 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" event={"ID":"ff89dfa5-e056-4753-99a0-4aad073a1734","Type":"ContainerStarted","Data":"282eb697d703c6f6e9a530f92201b311b507bb12a6538044221fae8ad9999b37"} Nov 22 03:08:18 crc kubenswrapper[4952]: I1122 03:08:18.498990 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" event={"ID":"e71d0c55-c142-4a8d-8677-accf9858de48","Type":"ContainerStarted","Data":"1f1a78e63c0adcf28efb1774e167fb8c0b7f2200b062ddb8b84c7cbbf984ba6e"} Nov 22 03:08:18 crc kubenswrapper[4952]: E1122 03:08:18.509845 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" podUID="d44a318e-4d58-4719-a567-6d849321b946" Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.524347 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" event={"ID":"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5","Type":"ContainerStarted","Data":"a62a2a9bb207ec16baef9af26c03820876954f3903c5355f5db8301476f0d9a9"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.553044 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" event={"ID":"87f47e69-902e-4a6a-a6d6-ce72f960a9e4","Type":"ContainerStarted","Data":"b53d0b39b2ae985862829f32c1eb3c1094a9687613ce9128b216f17dc35e2485"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.576603 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" event={"ID":"2704d952-0ecf-42b9-9185-625d8c662a00","Type":"ContainerStarted","Data":"448c5b4c873be327eecce17015f6444ea9948262051fe3825a50189f3a3312a6"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.581559 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" event={"ID":"996261d7-26a3-41f8-9531-73c3ec296c1d","Type":"ContainerStarted","Data":"61b505ffbb42bebbf9785347b66c6d1fb41d987cb48292aed3d87f62b5c23f1b"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 
03:08:19.584810 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" event={"ID":"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b","Type":"ContainerStarted","Data":"73173c1bbf2390488ff13dda1d4340db2bd14bbd6f238d909bf4cd4bcdda4140"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.586027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" event={"ID":"0b8276cf-2a3d-40ad-83c6-fa522270b8a7","Type":"ContainerStarted","Data":"665758089f873984f81785a852ef0e4f6c1e48168ee6aedd237f446f1d8f28c5"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.592323 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" event={"ID":"d44a318e-4d58-4719-a567-6d849321b946","Type":"ContainerStarted","Data":"acf22d4541f4b508a2f6f3808616a40fc5a04cb8dfc206c5a00292fb11601421"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.630848 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" event={"ID":"36ba8aa7-85ec-461d-a0d7-39f09c60289f","Type":"ContainerStarted","Data":"d585dc5943e34228c5d577f1dd372ed07a0fd825ba2633a7e648fefe644aa9d5"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.666229 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" event={"ID":"887a5b13-80a9-4d9c-8b14-6768805ca936","Type":"ContainerStarted","Data":"aa6e8b38a4490a4a7b7f1874817e1c684e127263fecfe6aca82b8612b8b5d8e0"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.709090 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" event={"ID":"c866a446-9629-4d59-8953-0599bda45549","Type":"ContainerStarted","Data":"f798d2c5ec75224bc1827d711716b2a443531d494c5074d96633df9bbe8fcab2"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.759039 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" event={"ID":"032f6fc1-7bde-422b-bbe0-d83027f069d0","Type":"ContainerStarted","Data":"037e6c7068892aa25e20f7d58afd73007bc3d5ec386fc126048842d609ee7d12"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.795128 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" event={"ID":"3166414a-5d0f-460b-81cd-a8cfab489ff5","Type":"ContainerStarted","Data":"2d712a7d77b4afc995b27518acc5a86952e8ee874d8c8010f7763e223c2ce94e"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.809427 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" event={"ID":"6523048e-98da-4e64-9c79-7bbeda6ea361","Type":"ContainerStarted","Data":"dcb1621972763943c9c0f9d302585f58f43cc6cdb8cd6f93e379d54c8828b5cb"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.814103 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" event={"ID":"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a","Type":"ContainerStarted","Data":"fc3174374e5c1e2350752d7c95e840123980a0eadd8f03755d3ef7af4b319589"} Nov 22 03:08:19 crc kubenswrapper[4952]: I1122 03:08:19.840995 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" event={"ID":"3de98aa0-3a15-4c46-adf3-d1715ccf5274","Type":"ContainerStarted","Data":"b79ce78276ebdf08a8f14c607945683a19135d860bb4f25ef942d6ca570cdc98"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.859016 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" event={"ID":"996261d7-26a3-41f8-9531-73c3ec296c1d","Type":"ContainerStarted","Data":"2b9d601537fa414c8cfbed8d1dd5b27b7898f859daeb8e79e70ab3e674eadd82"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.859143 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.870286 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" event={"ID":"e8259b74-ce3d-4875-ac32-71e4397b4c01","Type":"ContainerStarted","Data":"f3a8ebff8e943461884f877a9c6d70e4725d7d6a80d9a9e9c4b08cd1cc4a8fb3"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.870342 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" event={"ID":"e8259b74-ce3d-4875-ac32-71e4397b4c01","Type":"ContainerStarted","Data":"7003fe2ce5bcfb6b587a9006cbb51cee4fb6f9a80aa5f43116364877d33faefd"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.870448 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.876968 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" event={"ID":"dbb212fa-2d69-47c9-8ddc-3f1d78ce745a","Type":"ContainerStarted","Data":"9df4d23ee92546fae0473ec30d0d5b9fae9542729670c0381bc5ea0f798f5d1b"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.877916 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.885114 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" event={"ID":"36ba8aa7-85ec-461d-a0d7-39f09c60289f","Type":"ContainerStarted","Data":"6d049d63c43666bfd8d1df8cc62ff145b59b25298805f26232baf32e3a66e085"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.885809 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.890438 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" podStartSLOduration=6.334911957 podStartE2EDuration="18.890420606s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.548730132 +0000 UTC m=+849.854747405" lastFinishedPulling="2025-11-22 03:08:18.104238771 +0000 UTC m=+862.410256054" observedRunningTime="2025-11-22 03:08:20.889758348 +0000 UTC m=+865.195775621" watchObservedRunningTime="2025-11-22 03:08:20.890420606 +0000 UTC m=+865.196437879" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.893003 4952 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" event={"ID":"3de98aa0-3a15-4c46-adf3-d1715ccf5274","Type":"ContainerStarted","Data":"6fce5024c14a0788cc7150fdcfec66e57e888eb357bf01795e7857af7c697466"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.893750 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.904697 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" event={"ID":"87f47e69-902e-4a6a-a6d6-ce72f960a9e4","Type":"ContainerStarted","Data":"4c836890f3851971f0b019a90b073005071436364a3631dc1b9cb1504745ae4a"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.905481 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.907699 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" event={"ID":"6523048e-98da-4e64-9c79-7bbeda6ea361","Type":"ContainerStarted","Data":"55022d2f69125e9c69eb705c95dc7eddd0de3a4992c15d36a1a9476cc722e000"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.908153 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.914829 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" podStartSLOduration=6.326399927 podStartE2EDuration="18.914810317s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.519219123 +0000 UTC m=+849.825236396" lastFinishedPulling="2025-11-22 03:08:18.107629483 +0000 UTC m=+862.413646786" observedRunningTime="2025-11-22 03:08:20.908533836 +0000 UTC m=+865.214551099" watchObservedRunningTime="2025-11-22 03:08:20.914810317 +0000 UTC m=+865.220827590" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.922438 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" event={"ID":"c866a446-9629-4d59-8953-0599bda45549","Type":"ContainerStarted","Data":"c1490364e2e8f81482bcd3b2c405b967cedfbaea12cdbf8c8c91ac64b18d8467"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.923411 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.938844 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" event={"ID":"e71d0c55-c142-4a8d-8677-accf9858de48","Type":"ContainerStarted","Data":"7214e9d4a4d99f494e5bb816043646783dfa250b99be4613236e2573765f88e1"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.939492 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.970165 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" event={"ID":"ff89dfa5-e056-4753-99a0-4aad073a1734","Type":"ContainerStarted","Data":"c5e59798f4275327337d648aca0eeb6781d42ba0f4916b9c50fade6a3433f2e8"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.971774 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.977004 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" event={"ID":"078fa30e-9d17-4ca9-911a-43a0376ffe8f","Type":"ContainerStarted","Data":"e0ed5dae1a6d6ecb11d54c8d4a130613e757d1612653937804c90b61a3f2a654"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.977036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" event={"ID":"078fa30e-9d17-4ca9-911a-43a0376ffe8f","Type":"ContainerStarted","Data":"82cc99d1e86ad68f868108fe29e3e02f9c873da2f953c9917314b9eb5453dfc5"} Nov 22 03:08:20 crc kubenswrapper[4952]: I1122 03:08:20.978165 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.007594 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" event={"ID":"2704d952-0ecf-42b9-9185-625d8c662a00","Type":"ContainerStarted","Data":"22e76508fe690c380441181cc8ce9a35d4bec0991dfd860d3a4576ceaac171d8"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.008023 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.023499 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" podStartSLOduration=5.8264822590000005 podStartE2EDuration="19.023472498s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:04.260079738 +0000 UTC m=+848.566097011" lastFinishedPulling="2025-11-22 03:08:17.457069977 +0000 UTC m=+861.763087250" observedRunningTime="2025-11-22 03:08:20.999446198 +0000 UTC m=+865.305463471" watchObservedRunningTime="2025-11-22 03:08:21.023472498 +0000 UTC m=+865.329489761" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.023854 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" podStartSLOduration=5.723043229 podStartE2EDuration="19.023848189s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:04.782064753 +0000 UTC m=+849.088082026" lastFinishedPulling="2025-11-22 03:08:18.082869693 +0000 UTC m=+862.388886986" observedRunningTime="2025-11-22 03:08:20.956047933 +0000 UTC m=+865.262065206" watchObservedRunningTime="2025-11-22 03:08:21.023848189 +0000 UTC m=+865.329865462" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.037862 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" podStartSLOduration=6.255594659 podStartE2EDuration="19.037717855s" podCreationTimestamp="2025-11-22 
03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.299618226 +0000 UTC m=+849.605635499" lastFinishedPulling="2025-11-22 03:08:18.081741412 +0000 UTC m=+862.387758695" observedRunningTime="2025-11-22 03:08:21.032682718 +0000 UTC m=+865.338700011" watchObservedRunningTime="2025-11-22 03:08:21.037717855 +0000 UTC m=+865.343735128" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.049931 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" event={"ID":"032f6fc1-7bde-422b-bbe0-d83027f069d0","Type":"ContainerStarted","Data":"d82fc786e8ea6874e61b9ddcece92da94482ebed5b96c061f06759f351c38e12"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.050830 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.071376 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" podStartSLOduration=5.513943196 podStartE2EDuration="18.071351036s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.578962411 +0000 UTC m=+849.884979684" lastFinishedPulling="2025-11-22 03:08:18.136370241 +0000 UTC m=+862.442387524" observedRunningTime="2025-11-22 03:08:21.069412482 +0000 UTC m=+865.375429765" watchObservedRunningTime="2025-11-22 03:08:21.071351036 +0000 UTC m=+865.377368309" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.088472 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" event={"ID":"3166414a-5d0f-460b-81cd-a8cfab489ff5","Type":"ContainerStarted","Data":"486b70347317f4a14a044908d225a6b4bab560357ea6dd112c11a26c15b05dd1"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.089432 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.091473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" event={"ID":"d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5","Type":"ContainerStarted","Data":"000195f56a2b8ac14c2fa44706da6860848a504347a26de5c4584984d234bc7c"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.092083 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.110443 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" podStartSLOduration=5.524508611 podStartE2EDuration="18.110423834s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.522190653 +0000 UTC m=+849.828207926" lastFinishedPulling="2025-11-22 03:08:18.108105856 +0000 UTC m=+862.414123149" observedRunningTime="2025-11-22 03:08:21.105740726 +0000 UTC m=+865.411757999" watchObservedRunningTime="2025-11-22 03:08:21.110423834 +0000 UTC m=+865.416441107" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.115730 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" event={"ID":"887a5b13-80a9-4d9c-8b14-6768805ca936","Type":"ContainerStarted","Data":"5575b4469f6bc23b84acef149a0377a0b4397acbe6201f0e565d4e21c75803ff"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.116721 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.121527 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" event={"ID":"0b8276cf-2a3d-40ad-83c6-fa522270b8a7","Type":"ContainerStarted","Data":"c41cd6ba63d0196b90deae4414029cf0f7692dbc759cc2a3cde64be3eab9a58a"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.122011 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.128289 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" event={"ID":"5cbee4eb-c188-4a47-9d37-4de16bd79f07","Type":"ContainerStarted","Data":"b1de90b70a328d0c3f28e01b403375eb464ddf745e3ce5bb88a89cb5c860b255"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.128315 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" event={"ID":"5cbee4eb-c188-4a47-9d37-4de16bd79f07","Type":"ContainerStarted","Data":"f90fff1eda09d57e26116d9f94425dbad404bbdba3cbe1d6fa8d3bf70f050171"} Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.128670 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.137057 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" podStartSLOduration=6.500080069 podStartE2EDuration="19.137033294s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.471767248 +0000 UTC m=+849.777784521" lastFinishedPulling="2025-11-22 03:08:18.108720473 +0000 UTC m=+862.414737746" observedRunningTime="2025-11-22 03:08:21.13616116 +0000 UTC m=+865.442178433" watchObservedRunningTime="2025-11-22 03:08:21.137033294 +0000 UTC m=+865.443050567" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.177598 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" podStartSLOduration=18.177578771 podStartE2EDuration="18.177578771s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:08:21.171590079 +0000 UTC m=+865.477607352" watchObservedRunningTime="2025-11-22 03:08:21.177578771 +0000 UTC m=+865.483596034" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.218989 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" podStartSLOduration=6.580440975 podStartE2EDuration="19.218961822s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.468849109 +0000 UTC 
m=+849.774866382" lastFinishedPulling="2025-11-22 03:08:18.107369956 +0000 UTC m=+862.413387229" observedRunningTime="2025-11-22 03:08:21.216060664 +0000 UTC m=+865.522077957" watchObservedRunningTime="2025-11-22 03:08:21.218961822 +0000 UTC m=+865.524979105" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.238359 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" podStartSLOduration=10.010948576 podStartE2EDuration="19.238333287s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.29532893 +0000 UTC m=+849.601346203" lastFinishedPulling="2025-11-22 03:08:14.522713641 +0000 UTC m=+858.828730914" observedRunningTime="2025-11-22 03:08:21.237916936 +0000 UTC m=+865.543934209" watchObservedRunningTime="2025-11-22 03:08:21.238333287 +0000 UTC m=+865.544350560" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.270470 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" podStartSLOduration=5.8744001059999995 podStartE2EDuration="18.270447297s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.713454193 +0000 UTC m=+850.019471456" lastFinishedPulling="2025-11-22 03:08:18.109501364 +0000 UTC m=+862.415518647" observedRunningTime="2025-11-22 03:08:21.260698973 +0000 UTC m=+865.566716246" watchObservedRunningTime="2025-11-22 03:08:21.270447297 +0000 UTC m=+865.576464570" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.288381 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" podStartSLOduration=6.724062055 podStartE2EDuration="19.288368692s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.540217532 +0000 UTC m=+849.846234795" lastFinishedPulling="2025-11-22 03:08:18.104524159 +0000 UTC m=+862.410541432" observedRunningTime="2025-11-22 03:08:21.285281668 +0000 UTC m=+865.591298941" watchObservedRunningTime="2025-11-22 03:08:21.288368692 +0000 UTC m=+865.594385965" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.306780 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" podStartSLOduration=6.776091814 podStartE2EDuration="19.30675879s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.578783066 +0000 UTC m=+849.884800339" lastFinishedPulling="2025-11-22 03:08:18.109450042 +0000 UTC m=+862.415467315" observedRunningTime="2025-11-22 03:08:21.303939203 +0000 UTC m=+865.609956486" watchObservedRunningTime="2025-11-22 03:08:21.30675879 +0000 UTC m=+865.612776063" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.328497 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" podStartSLOduration=6.88344091 podStartE2EDuration="19.328476187s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.536577764 +0000 UTC m=+849.842595037" lastFinishedPulling="2025-11-22 03:08:17.981613041 +0000 UTC m=+862.287630314" observedRunningTime="2025-11-22 03:08:21.322492776 +0000 UTC m=+865.628510049" watchObservedRunningTime="2025-11-22 03:08:21.328476187 +0000 UTC m=+865.634493460" Nov 22 03:08:21 
crc kubenswrapper[4952]: I1122 03:08:21.353776 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" podStartSLOduration=7.867573159 podStartE2EDuration="19.353750402s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:03.988217937 +0000 UTC m=+848.294235210" lastFinishedPulling="2025-11-22 03:08:15.47439518 +0000 UTC m=+859.780412453" observedRunningTime="2025-11-22 03:08:21.345114788 +0000 UTC m=+865.651132071" watchObservedRunningTime="2025-11-22 03:08:21.353750402 +0000 UTC m=+865.659767675" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.375210 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" podStartSLOduration=5.788398549 podStartE2EDuration="19.375176423s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:04.396086611 +0000 UTC m=+848.702103884" lastFinishedPulling="2025-11-22 03:08:17.982864485 +0000 UTC m=+862.288881758" observedRunningTime="2025-11-22 03:08:21.370893077 +0000 UTC m=+865.676910350" watchObservedRunningTime="2025-11-22 03:08:21.375176423 +0000 UTC m=+865.681193696" Nov 22 03:08:21 crc kubenswrapper[4952]: I1122 03:08:21.404601 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" podStartSLOduration=7.34835723 podStartE2EDuration="19.404578459s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:06.046264505 +0000 UTC m=+850.352281778" lastFinishedPulling="2025-11-22 03:08:18.102485724 +0000 UTC m=+862.408503007" observedRunningTime="2025-11-22 03:08:21.399220003 +0000 UTC m=+865.705237286" watchObservedRunningTime="2025-11-22 03:08:21.404578459 +0000 UTC m=+865.710595732" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.003225 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-j9bj9" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.028299 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-fdcbbd9b5-w9lwq" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.034446 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-t2mlk" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.139463 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-str4f" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.152830 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-ztqdk" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.173887 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-hfx5m" Nov 22 03:08:23 crc kubenswrapper[4952]: I1122 03:08:23.385261 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7969689c84-lhbkq" Nov 22 03:08:24 crc kubenswrapper[4952]: I1122 03:08:24.029179 4952 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d656998f4-d9k2w" Nov 22 03:08:24 crc kubenswrapper[4952]: I1122 03:08:24.621312 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-66b68cc995-2h722" Nov 22 03:08:25 crc kubenswrapper[4952]: I1122 03:08:25.111926 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9" Nov 22 03:08:25 crc kubenswrapper[4952]: I1122 03:08:25.188779 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" event={"ID":"e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b","Type":"ContainerStarted","Data":"26d5072090b8d40225e689b0286df916a7d74814120461f6e47a29daf7def88c"} Nov 22 03:08:25 crc kubenswrapper[4952]: I1122 03:08:25.189134 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:25 crc kubenswrapper[4952]: I1122 03:08:25.214177 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" podStartSLOduration=4.265325306 podStartE2EDuration="23.214154024s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.787856257 +0000 UTC m=+850.093873530" lastFinishedPulling="2025-11-22 03:08:24.736684975 +0000 UTC m=+869.042702248" observedRunningTime="2025-11-22 03:08:25.203975869 +0000 UTC m=+869.509993142" watchObservedRunningTime="2025-11-22 03:08:25.214154024 +0000 UTC m=+869.520171287" Nov 22 03:08:26 crc kubenswrapper[4952]: I1122 03:08:26.205325 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" event={"ID":"d44a318e-4d58-4719-a567-6d849321b946","Type":"ContainerStarted","Data":"ce79863b5e48c0deb865fc91fec38da6c0afa21d7f30f36598eccdef2600fa01"} Nov 22 03:08:27 crc kubenswrapper[4952]: I1122 03:08:27.215916 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" event={"ID":"343962a9-648e-4dd5-a813-730ef99a136e","Type":"ContainerStarted","Data":"ad6ce2cf9b8f8f702ff632ba9623038973caf5daf96d6e44b3a91916579465f3"} Nov 22 03:08:27 crc kubenswrapper[4952]: I1122 03:08:27.216020 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:27 crc kubenswrapper[4952]: I1122 03:08:27.249576 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r47mr" podStartSLOduration=4.224276902 podStartE2EDuration="24.249526718s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.718065567 +0000 UTC m=+850.024082840" lastFinishedPulling="2025-11-22 03:08:25.743315383 +0000 UTC m=+870.049332656" observedRunningTime="2025-11-22 03:08:27.244573965 +0000 UTC m=+871.550591248" watchObservedRunningTime="2025-11-22 03:08:27.249526718 +0000 UTC m=+871.555543981" Nov 22 03:08:27 crc kubenswrapper[4952]: I1122 03:08:27.272777 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" podStartSLOduration=6.287148833 podStartE2EDuration="25.272746847s" podCreationTimestamp="2025-11-22 03:08:02 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.811894358 +0000 UTC m=+850.117911631" lastFinishedPulling="2025-11-22 03:08:24.797492372 +0000 UTC m=+869.103509645" observedRunningTime="2025-11-22 03:08:27.266288592 +0000 UTC m=+871.572305865" watchObservedRunningTime="2025-11-22 03:08:27.272746847 +0000 UTC m=+871.578764120" Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.236596 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" event={"ID":"6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2","Type":"ContainerStarted","Data":"6dd06faa55d485554da7c16f70b4150778d560615950ba75e40da8dac2582ebc"} Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.237669 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.253657 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" event={"ID":"6791492d-51b4-4cd0-bbac-bd690baec76e","Type":"ContainerStarted","Data":"962cff1c409d6721fa93f7d88ce82a0c6fa90c50528d38107a8656012172f695"} Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.254696 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.270412 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" podStartSLOduration=3.967370536 podStartE2EDuration="26.270389349s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.785108753 +0000 UTC m=+850.091126016" lastFinishedPulling="2025-11-22 03:08:28.088127546 +0000 UTC m=+872.394144829" observedRunningTime="2025-11-22 03:08:29.266040362 +0000 UTC m=+873.572057645" watchObservedRunningTime="2025-11-22 03:08:29.270389349 +0000 UTC m=+873.576406622" Nov 22 03:08:29 crc kubenswrapper[4952]: I1122 03:08:29.291241 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" podStartSLOduration=3.964230231 podStartE2EDuration="26.291214453s" podCreationTimestamp="2025-11-22 03:08:03 +0000 UTC" firstStartedPulling="2025-11-22 03:08:05.784649391 +0000 UTC m=+850.090666664" lastFinishedPulling="2025-11-22 03:08:28.111633603 +0000 UTC m=+872.417650886" observedRunningTime="2025-11-22 03:08:29.284254355 +0000 UTC m=+873.590271668" watchObservedRunningTime="2025-11-22 03:08:29.291214453 +0000 UTC m=+873.597231736" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.279467 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-g7gzn" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.421085 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-29fxl" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.482960 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-7454b96578-8d2db" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.483957 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58f887965d-dkhsn" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.565632 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-qw49c" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.745082 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-wnb2k" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.760724 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-vlsq4" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.815218 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-gr8vb" Nov 22 03:08:33 crc kubenswrapper[4952]: I1122 03:08:33.853162 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-68bjm" Nov 22 03:08:34 crc kubenswrapper[4952]: I1122 03:08:34.124210 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-ftxgw" Nov 22 03:08:34 crc kubenswrapper[4952]: I1122 03:08:34.168945 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-b4c496f69-hb768" Nov 22 03:08:34 crc kubenswrapper[4952]: I1122 03:08:34.652384 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-2vq4f" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.270858 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.273162 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.277700 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.277733 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.277796 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.277850 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jtgrd" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.290896 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n46d9\" (UniqueName: \"kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.291111 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.295613 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.328939 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.330474 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.332658 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.348847 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.392186 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.392244 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshdl\" (UniqueName: \"kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.392273 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.392306 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.392324 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n46d9\" (UniqueName: \"kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.394110 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.418028 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46d9\" (UniqueName: \"kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9\") pod \"dnsmasq-dns-675f4bcbfc-l6t5s\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.493480 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 
03:08:52.493988 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jshdl\" (UniqueName: \"kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.494036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.495152 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.495589 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.513060 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshdl\" (UniqueName: \"kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl\") pod \"dnsmasq-dns-78dd6ddcc-xfslp\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.593256 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.646100 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.926427 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:08:52 crc kubenswrapper[4952]: W1122 03:08:52.930814 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2353f8d1_e2bf_488c_9f9c_08147b19b8a3.slice/crio-120c959276a2435403389c26f9fa6f8c0cf57bb7e0138b40c8b96b1689c2ce6b WatchSource:0}: Error finding container 120c959276a2435403389c26f9fa6f8c0cf57bb7e0138b40c8b96b1689c2ce6b: Status 404 returned error can't find the container with id 120c959276a2435403389c26f9fa6f8c0cf57bb7e0138b40c8b96b1689c2ce6b Nov 22 03:08:52 crc kubenswrapper[4952]: I1122 03:08:52.935772 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:08:53 crc kubenswrapper[4952]: I1122 03:08:53.091865 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:08:53 crc kubenswrapper[4952]: I1122 03:08:53.487291 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" event={"ID":"2353f8d1-e2bf-488c-9f9c-08147b19b8a3","Type":"ContainerStarted","Data":"120c959276a2435403389c26f9fa6f8c0cf57bb7e0138b40c8b96b1689c2ce6b"} Nov 22 03:08:53 crc kubenswrapper[4952]: I1122 03:08:53.488902 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" event={"ID":"0ab234df-237b-46e5-b663-dec7e72c4476","Type":"ContainerStarted","Data":"7d10f37c56f01e9c73c2c7f8a98bc11deb8df08476a7553135e6d9f6c650188e"} Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.545234 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.571137 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.572378 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.588067 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.658735 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.658838 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.658867 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgh7\" (UniqueName: \"kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.759839 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.759889 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgh7\" (UniqueName: \"kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.759959 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.763151 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.790980 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.820956 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgh7\" (UniqueName: 
\"kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7\") pod \"dnsmasq-dns-666b6646f7-bbm99\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.895913 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.898522 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.936121 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.943178 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.943485 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.965682 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.965932 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpwpr\" (UniqueName: \"kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:55 crc kubenswrapper[4952]: I1122 03:08:55.966052 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.066827 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.066920 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpwpr\" (UniqueName: \"kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.066968 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.068140 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.068372 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.091386 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpwpr\" (UniqueName: \"kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr\") pod \"dnsmasq-dns-57d769cc4f-qtwxh\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.268028 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.708363 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.710144 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.716476 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.716795 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.718483 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pnpt7" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.720147 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.720164 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.720163 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.720183 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.734368 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879502 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879698 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879764 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879815 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879876 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.879937 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.880015 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cgj6\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.880050 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.880106 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.880202 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.880302 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 
22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981509 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981583 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cgj6\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981611 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981628 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981643 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981665 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981692 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981743 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981762 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981792 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.981808 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.982248 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.983238 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.984143 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.984173 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.984360 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.985363 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.989936 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.990039 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.990270 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:56 crc kubenswrapper[4952]: I1122 03:08:56.996782 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.002067 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cgj6\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.015684 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.028137 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.029744 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.032929 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.036527 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.036792 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.036936 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-79mtl" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.037296 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.039970 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.040354 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.049316 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.049905 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.186751 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.186815 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.186891 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187017 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvdp\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187073 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187121 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187165 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187219 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187285 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187447 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.187587 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.288990 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.289092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.289118 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293626 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293745 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvdp\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293784 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293816 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 
03:08:57.293853 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293881 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.293918 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.294011 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.297760 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.298443 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.298476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.298941 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.299190 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.299479 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.299756 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.305837 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.316252 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.327187 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.328363 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.334647 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvdp\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp\") pod \"rabbitmq-cell1-server-0\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:57 crc kubenswrapper[4952]: I1122 03:08:57.405561 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.341902 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.342221 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.516325 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.521168 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.524035 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.524345 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.524782 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.524953 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lhqj7" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.534662 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.539713 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.616863 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.616941 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617275 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617411 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617657 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a3eb772-8262-4b28-873f-63f00885054d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617726 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xrd\" (UniqueName: \"kubernetes.io/projected/9a3eb772-8262-4b28-873f-63f00885054d-kube-api-access-94xrd\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617809 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.617896 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719166 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719238 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719289 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719320 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719472 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a3eb772-8262-4b28-873f-63f00885054d-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719506 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94xrd\" (UniqueName: \"kubernetes.io/projected/9a3eb772-8262-4b28-873f-63f00885054d-kube-api-access-94xrd\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719536 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.719987 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.721414 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a3eb772-8262-4b28-873f-63f00885054d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.722081 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.722372 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.722678 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3eb772-8262-4b28-873f-63f00885054d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.732860 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.735690 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9a3eb772-8262-4b28-873f-63f00885054d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.740254 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xrd\" (UniqueName: \"kubernetes.io/projected/9a3eb772-8262-4b28-873f-63f00885054d-kube-api-access-94xrd\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.745232 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9a3eb772-8262-4b28-873f-63f00885054d\") " pod="openstack/openstack-galera-0" Nov 22 03:08:58 crc kubenswrapper[4952]: I1122 03:08:58.860685 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.961138 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.963180 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.969607 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kfqrg" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.971100 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.971348 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.972267 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 03:08:59 crc kubenswrapper[4952]: I1122 03:08:59.977600 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045136 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045219 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045261 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 
03:09:00.045296 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045327 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8kw\" (UniqueName: \"kubernetes.io/projected/0f098b30-65ee-4b19-9674-c384bddf0832-kube-api-access-7x8kw\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045360 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045414 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.045455 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.146563 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147038 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147083 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147117 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147146 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147176 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147195 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147219 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8kw\" (UniqueName: \"kubernetes.io/projected/0f098b30-65ee-4b19-9674-c384bddf0832-kube-api-access-7x8kw\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.147330 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.150152 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.150631 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.150954 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.152382 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f098b30-65ee-4b19-9674-c384bddf0832-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.160771 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 03:09:00 
crc kubenswrapper[4952]: I1122 03:09:00.162101 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.171332 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.171859 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f098b30-65ee-4b19-9674-c384bddf0832-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.171882 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.172818 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.174303 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gtntc" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.175475 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.176942 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8kw\" (UniqueName: \"kubernetes.io/projected/0f098b30-65ee-4b19-9674-c384bddf0832-kube-api-access-7x8kw\") pod \"openstack-cell1-galera-0\" (UID: \"0f098b30-65ee-4b19-9674-c384bddf0832\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.178924 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.249225 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-config-data\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.249608 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-kolla-config\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.249787 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.249968 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.250072 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnl5j\" (UniqueName: \"kubernetes.io/projected/169f3305-945d-47b9-8764-6e37ee8863e0-kube-api-access-rnl5j\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.299243 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.352836 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-config-data\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.353077 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-kolla-config\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.353131 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.353283 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.353413 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnl5j\" (UniqueName: \"kubernetes.io/projected/169f3305-945d-47b9-8764-6e37ee8863e0-kube-api-access-rnl5j\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.353769 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-config-data\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.354087 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/169f3305-945d-47b9-8764-6e37ee8863e0-kolla-config\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.357637 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.357911 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169f3305-945d-47b9-8764-6e37ee8863e0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.383277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnl5j\" (UniqueName: \"kubernetes.io/projected/169f3305-945d-47b9-8764-6e37ee8863e0-kube-api-access-rnl5j\") pod \"memcached-0\" (UID: \"169f3305-945d-47b9-8764-6e37ee8863e0\") " pod="openstack/memcached-0" Nov 22 03:09:00 crc kubenswrapper[4952]: I1122 03:09:00.537755 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.168735 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.170350 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.172716 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gp48k" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.186406 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.290028 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmg8d\" (UniqueName: \"kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d\") pod \"kube-state-metrics-0\" (UID: \"11ba6f7e-b307-4bd0-9f84-46bcf5721c38\") " pod="openstack/kube-state-metrics-0" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.391797 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmg8d\" (UniqueName: \"kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d\") pod \"kube-state-metrics-0\" (UID: \"11ba6f7e-b307-4bd0-9f84-46bcf5721c38\") " pod="openstack/kube-state-metrics-0" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.415952 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmg8d\" (UniqueName: \"kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d\") pod \"kube-state-metrics-0\" (UID: \"11ba6f7e-b307-4bd0-9f84-46bcf5721c38\") " pod="openstack/kube-state-metrics-0" Nov 22 03:09:02 crc kubenswrapper[4952]: I1122 03:09:02.502107 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.275297 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9kcvz"] Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.277822 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.280319 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.280620 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-f6ssz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.280725 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.313811 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9kcvz"] Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.363793 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-log-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364132 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364269 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364408 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg9sz\" (UniqueName: \"kubernetes.io/projected/82bf89be-221b-4963-bfa4-794e0eb978c6-kube-api-access-vg9sz\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364522 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-combined-ca-bundle\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364709 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82bf89be-221b-4963-bfa4-794e0eb978c6-scripts\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.364812 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-ovn-controller-tls-certs\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.367055 4952 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-blvps"] Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.368932 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.402621 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-blvps"] Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.466806 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-combined-ca-bundle\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467174 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-scripts\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467350 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v52gl\" (UniqueName: \"kubernetes.io/projected/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-kube-api-access-v52gl\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467423 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82bf89be-221b-4963-bfa4-794e0eb978c6-scripts\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467449 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-lib\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467566 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-ovn-controller-tls-certs\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467766 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-log-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467795 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-log\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 
03:09:05.467818 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467842 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467870 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg9sz\" (UniqueName: \"kubernetes.io/projected/82bf89be-221b-4963-bfa4-794e0eb978c6-kube-api-access-vg9sz\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467921 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-etc-ovs\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.467953 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-run\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.469314 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-log-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.469473 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.469653 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82bf89be-221b-4963-bfa4-794e0eb978c6-var-run-ovn\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.470409 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82bf89be-221b-4963-bfa4-794e0eb978c6-scripts\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.486781 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-combined-ca-bundle\") pod \"ovn-controller-9kcvz\" (UID: 
\"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.486820 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/82bf89be-221b-4963-bfa4-794e0eb978c6-ovn-controller-tls-certs\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.504268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg9sz\" (UniqueName: \"kubernetes.io/projected/82bf89be-221b-4963-bfa4-794e0eb978c6-kube-api-access-vg9sz\") pod \"ovn-controller-9kcvz\" (UID: \"82bf89be-221b-4963-bfa4-794e0eb978c6\") " pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570075 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-scripts\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570128 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v52gl\" (UniqueName: \"kubernetes.io/projected/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-kube-api-access-v52gl\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570164 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-lib\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570359 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-log\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570413 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-etc-ovs\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570449 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-run\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.570609 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-run\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.571206 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-lib\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.572247 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-scripts\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.573063 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-etc-ovs\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.573208 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-var-log\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.601849 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v52gl\" (UniqueName: \"kubernetes.io/projected/36a574de-e2a6-4711-82ae-a7ffc34ef5fd-kube-api-access-v52gl\") pod \"ovn-controller-ovs-blvps\" (UID: \"36a574de-e2a6-4711-82ae-a7ffc34ef5fd\") " pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.613815 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:05 crc kubenswrapper[4952]: I1122 03:09:05.698319 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.809689 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.811641 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.820034 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.820303 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.820981 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vbhrt" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.821169 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.823836 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.828558 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894615 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894669 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjfr\" (UniqueName: \"kubernetes.io/projected/dbd8adf1-9949-4500-83d9-dbcb3d42037f-kube-api-access-bcjfr\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894698 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894736 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894769 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894807 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894837 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:06 crc kubenswrapper[4952]: I1122 03:09:06.894886 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.002996 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003105 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003170 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003204 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003287 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003507 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003538 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjfr\" (UniqueName: \"kubernetes.io/projected/dbd8adf1-9949-4500-83d9-dbcb3d42037f-kube-api-access-bcjfr\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.003609 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc 
kubenswrapper[4952]: I1122 03:09:07.004024 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.004363 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.004694 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.011273 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.014225 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.017532 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8adf1-9949-4500-83d9-dbcb3d42037f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.020508 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd8adf1-9949-4500-83d9-dbcb3d42037f-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.032921 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjfr\" (UniqueName: \"kubernetes.io/projected/dbd8adf1-9949-4500-83d9-dbcb3d42037f-kube-api-access-bcjfr\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.051351 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbd8adf1-9949-4500-83d9-dbcb3d42037f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:07 crc kubenswrapper[4952]: I1122 03:09:07.138034 4952 util.go:30] "No sandbox for pod can be found. 
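
local-storage07-crc is a kubernetes.io/local-volume PV, so it goes through the two-phase flow visible above: MountVolume.MountDevice first attaches the device mount path (/mnt/openstack/pv07 on the node), and MountVolume.SetUp then bind-mounts that global path into the pod's volume directory. A hedged sketch of what such a PV object could look like follows; the name and local path come from the log, while the capacity, access mode, and hostname selector are assumptions (a local PV must carry node affinity, and the node here is "crc").

```go
package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	pv := corev1.PersistentVolume{
		ObjectMeta: metav1.ObjectMeta{Name: "local-storage07-crc"},
		Spec: corev1.PersistentVolumeSpec{
			// Capacity and access mode are illustrative assumptions.
			Capacity: corev1.ResourceList{
				corev1.ResourceStorage: resource.MustParse("10Gi"),
			},
			AccessModes: []corev1.PersistentVolumeAccessMode{corev1.ReadWriteOnce},
			// The local path matches the "device mount path" in the log.
			PersistentVolumeSource: corev1.PersistentVolumeSource{
				Local: &corev1.LocalVolumeSource{Path: "/mnt/openstack/pv07"},
			},
			// Local PVs require node affinity; pinning to the single CRC node.
			NodeAffinity: &corev1.VolumeNodeAffinity{
				Required: &corev1.NodeSelector{
					NodeSelectorTerms: []corev1.NodeSelectorTerm{{
						MatchExpressions: []corev1.NodeSelectorRequirement{{
							Key:      "kubernetes.io/hostname",
							Operator: corev1.NodeSelectorOpIn,
							Values:   []string{"crc"},
						}},
					}},
				},
			},
		},
	}
	out, _ := json.MarshalIndent(pv, "", "  ")
	fmt.Println(string(out))
}
```
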
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.916124 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.916784 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jshdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-xfslp_openstack(2353f8d1-e2bf-488c-9f9c-08147b19b8a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.918214 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" podUID="2353f8d1-e2bf-488c-9f9c-08147b19b8a3" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.956702 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.959023 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n46d9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-l6t5s_openstack(0ab234df-237b-46e5-b663-dec7e72c4476): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:09:08 crc kubenswrapper[4952]: E1122 03:09:08.960781 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" podUID="0ab234df-237b-46e5-b663-dec7e72c4476" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.384128 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: W1122 03:09:09.388626 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74351431_f23e_45c3_a8a5_08143737551a.slice/crio-15e079ab10ccec60b459174d579111f87cabba60632fa99feb7199d34f507d4a WatchSource:0}: Error finding container 15e079ab10ccec60b459174d579111f87cabba60632fa99feb7199d34f507d4a: Status 404 returned error can't find the container with id 15e079ab10ccec60b459174d579111f87cabba60632fa99feb7199d34f507d4a Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.467007 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.470915 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.473604 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.475077 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.475227 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.476031 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bw2f8" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.479651 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552005 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552070 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552183 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552269 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-config\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552328 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fx8z\" (UniqueName: \"kubernetes.io/projected/da62937c-7359-4ba3-bf95-ddc5545df677-kube-api-access-4fx8z\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552517 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552648 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " 
pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.552783 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.644906 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654120 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654179 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654229 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654288 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654316 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-config\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654350 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fx8z\" (UniqueName: \"kubernetes.io/projected/da62937c-7359-4ba3-bf95-ddc5545df677-kube-api-access-4fx8z\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.654392 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: 
I1122 03:09:09.654884 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.655914 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.658591 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: W1122 03:09:09.659170 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod120e59b6_3c22_4ae8_874c_127e794a328f.slice/crio-f48ab05ba62aeb9b0139c7465630bafc0fe3567bf7584887e6d68d00d51bd1f3 WatchSource:0}: Error finding container f48ab05ba62aeb9b0139c7465630bafc0fe3567bf7584887e6d68d00d51bd1f3: Status 404 returned error can't find the container with id f48ab05ba62aeb9b0139c7465630bafc0fe3567bf7584887e6d68d00d51bd1f3 Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.659522 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-config\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.664061 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da62937c-7359-4ba3-bf95-ddc5545df677-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.664788 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.671127 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.671722 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.673220 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da62937c-7359-4ba3-bf95-ddc5545df677-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.679305 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.683656 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fx8z\" (UniqueName: \"kubernetes.io/projected/da62937c-7359-4ba3-bf95-ddc5545df677-kube-api-access-4fx8z\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.685769 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9a3eb772-8262-4b28-873f-63f00885054d","Type":"ContainerStarted","Data":"ff186bd8760a31f7e498c77dd4505b76285ec780fea309bb1df977d2f80310a5"} Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.690826 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" event={"ID":"120e59b6-3c22-4ae8-874c-127e794a328f","Type":"ContainerStarted","Data":"f48ab05ba62aeb9b0139c7465630bafc0fe3567bf7584887e6d68d00d51bd1f3"} Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.692707 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerStarted","Data":"15e079ab10ccec60b459174d579111f87cabba60632fa99feb7199d34f507d4a"} Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.698071 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"da62937c-7359-4ba3-bf95-ddc5545df677\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.788835 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: W1122 03:09:09.791424 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11ba6f7e_b307_4bd0_9f84_46bcf5721c38.slice/crio-9710774eefc2dcbe8b9659ef22d1fa3dbb63633effe4955e710bfceec129f991 WatchSource:0}: Error finding container 9710774eefc2dcbe8b9659ef22d1fa3dbb63633effe4955e710bfceec129f991: Status 404 returned error can't find the container with id 9710774eefc2dcbe8b9659ef22d1fa3dbb63633effe4955e710bfceec129f991 Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.801310 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.816034 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.925409 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9kcvz"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.967079 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-blvps"] Nov 22 03:09:09 crc kubenswrapper[4952]: I1122 03:09:09.973080 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:09:09 crc kubenswrapper[4952]: W1122 03:09:09.996289 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82bf89be_221b_4963_bfa4_794e0eb978c6.slice/crio-3bd14ba427bab8987a8fde6aff5cd95f6bd3e5323b94971ddd8d8664b84baaf2 WatchSource:0}: Error finding container 3bd14ba427bab8987a8fde6aff5cd95f6bd3e5323b94971ddd8d8664b84baaf2: Status 404 returned error can't find the container with id 3bd14ba427bab8987a8fde6aff5cd95f6bd3e5323b94971ddd8d8664b84baaf2 Nov 22 03:09:10 crc kubenswrapper[4952]: W1122 03:09:10.002193 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf52b66b6_7362_448b_a597_0c71381a4fa0.slice/crio-7fee7d86b173bfbba094fcf7addaa7b51304bb4cb0dcaf6d196f6bc647035344 WatchSource:0}: Error finding container 7fee7d86b173bfbba094fcf7addaa7b51304bb4cb0dcaf6d196f6bc647035344: Status 404 returned error can't find the container with id 7fee7d86b173bfbba094fcf7addaa7b51304bb4cb0dcaf6d196f6bc647035344 Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.130123 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.175758 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.207108 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-74dm2"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.209276 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.211806 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.219464 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-74dm2"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.291857 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.384975 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config" (OuterVolumeSpecName: "config") pod "0ab234df-237b-46e5-b663-dec7e72c4476" (UID: "0ab234df-237b-46e5-b663-dec7e72c4476"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.385090 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config\") pod \"0ab234df-237b-46e5-b663-dec7e72c4476\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.385206 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n46d9\" (UniqueName: \"kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9\") pod \"0ab234df-237b-46e5-b663-dec7e72c4476\" (UID: \"0ab234df-237b-46e5-b663-dec7e72c4476\") " Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.385233 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jshdl\" (UniqueName: \"kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl\") pod \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.386809 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config\") pod \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.386909 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc\") pod \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\" (UID: \"2353f8d1-e2bf-488c-9f9c-08147b19b8a3\") " Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387249 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-combined-ca-bundle\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387315 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovn-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387387 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2610ac31-191b-4b34-8b6c-2362a88d2e40-config\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387452 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovs-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387508 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387532 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjxdv\" (UniqueName: \"kubernetes.io/projected/2610ac31-191b-4b34-8b6c-2362a88d2e40-kube-api-access-kjxdv\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.387736 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab234df-237b-46e5-b663-dec7e72c4476-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.395120 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9" (OuterVolumeSpecName: "kube-api-access-n46d9") pod "0ab234df-237b-46e5-b663-dec7e72c4476" (UID: "0ab234df-237b-46e5-b663-dec7e72c4476"). InnerVolumeSpecName "kube-api-access-n46d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.395742 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2353f8d1-e2bf-488c-9f9c-08147b19b8a3" (UID: "2353f8d1-e2bf-488c-9f9c-08147b19b8a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.395918 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl" (OuterVolumeSpecName: "kube-api-access-jshdl") pod "2353f8d1-e2bf-488c-9f9c-08147b19b8a3" (UID: "2353f8d1-e2bf-488c-9f9c-08147b19b8a3"). InnerVolumeSpecName "kube-api-access-jshdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.398397 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.400339 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config" (OuterVolumeSpecName: "config") pod "2353f8d1-e2bf-488c-9f9c-08147b19b8a3" (UID: "2353f8d1-e2bf-488c-9f9c-08147b19b8a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.424997 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.429356 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.483484 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.485972 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490294 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-combined-ca-bundle\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490353 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovn-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490444 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2610ac31-191b-4b34-8b6c-2362a88d2e40-config\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490480 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490517 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwrd\" (UniqueName: \"kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490558 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovs-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490588 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 
03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490620 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490641 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjxdv\" (UniqueName: \"kubernetes.io/projected/2610ac31-191b-4b34-8b6c-2362a88d2e40-kube-api-access-kjxdv\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490741 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n46d9\" (UniqueName: \"kubernetes.io/projected/0ab234df-237b-46e5-b663-dec7e72c4476-kube-api-access-n46d9\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490761 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jshdl\" (UniqueName: \"kubernetes.io/projected/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-kube-api-access-jshdl\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490772 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.490782 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2353f8d1-e2bf-488c-9f9c-08147b19b8a3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.491103 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovs-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.491476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2610ac31-191b-4b34-8b6c-2362a88d2e40-ovn-rundir\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.492144 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2610ac31-191b-4b34-8b6c-2362a88d2e40-config\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.496299 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.496310 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2610ac31-191b-4b34-8b6c-2362a88d2e40-combined-ca-bundle\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.514738 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjxdv\" (UniqueName: \"kubernetes.io/projected/2610ac31-191b-4b34-8b6c-2362a88d2e40-kube-api-access-kjxdv\") pod \"ovn-controller-metrics-74dm2\" (UID: \"2610ac31-191b-4b34-8b6c-2362a88d2e40\") " pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.544463 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-74dm2" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.593802 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.593888 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwrd\" (UniqueName: \"kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.594648 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.594866 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.594982 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.595455 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: W1122 03:09:10.597012 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda62937c_7359_4ba3_bf95_ddc5545df677.slice/crio-d0ef30e8b5339e947fe8a5ad9799c1981b3ac5a78891e16599b60181d4dda418 WatchSource:0}: Error finding container d0ef30e8b5339e947fe8a5ad9799c1981b3ac5a78891e16599b60181d4dda418: Status 404 returned error can't find the container with id d0ef30e8b5339e947fe8a5ad9799c1981b3ac5a78891e16599b60181d4dda418 Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.599362 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.600005 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.617555 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwrd\" (UniqueName: \"kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd\") pod \"dnsmasq-dns-7fd796d7df-jgf6r\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.721848 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0f098b30-65ee-4b19-9674-c384bddf0832","Type":"ContainerStarted","Data":"29d8a87169de0b8c32e51463be0fe94522e028c60bb8bbc4b985fd5372466ee6"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.724057 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"169f3305-945d-47b9-8764-6e37ee8863e0","Type":"ContainerStarted","Data":"bbb497f526994907e5f6f70850cd9cfa2a47684f15c41ec40032d36569031f8e"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.729752 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbd8adf1-9949-4500-83d9-dbcb3d42037f","Type":"ContainerStarted","Data":"6e1440516ee0c029994d962e2b6340c684dda8a9723ac214eaa94dcb1cc11688"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.732396 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blvps" event={"ID":"36a574de-e2a6-4711-82ae-a7ffc34ef5fd","Type":"ContainerStarted","Data":"8179b0052f7b1ade3a17809342bc7a38e1ac55b052053471048cf59d5376bbe7"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.738284 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerStarted","Data":"328b6d07741ff453284284ebe749afe3a53affc8f28f722d102d5a288ad333ea"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.740513 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9kcvz" event={"ID":"82bf89be-221b-4963-bfa4-794e0eb978c6","Type":"ContainerStarted","Data":"3bd14ba427bab8987a8fde6aff5cd95f6bd3e5323b94971ddd8d8664b84baaf2"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.746909 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11ba6f7e-b307-4bd0-9f84-46bcf5721c38","Type":"ContainerStarted","Data":"9710774eefc2dcbe8b9659ef22d1fa3dbb63633effe4955e710bfceec129f991"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.749028 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da62937c-7359-4ba3-bf95-ddc5545df677","Type":"ContainerStarted","Data":"d0ef30e8b5339e947fe8a5ad9799c1981b3ac5a78891e16599b60181d4dda418"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.756259 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-666b6646f7-bbm99" event={"ID":"f52b66b6-7362-448b-a597-0c71381a4fa0","Type":"ContainerStarted","Data":"7fee7d86b173bfbba094fcf7addaa7b51304bb4cb0dcaf6d196f6bc647035344"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.762501 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" event={"ID":"2353f8d1-e2bf-488c-9f9c-08147b19b8a3","Type":"ContainerDied","Data":"120c959276a2435403389c26f9fa6f8c0cf57bb7e0138b40c8b96b1689c2ce6b"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.762657 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xfslp" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.768568 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" event={"ID":"0ab234df-237b-46e5-b663-dec7e72c4476","Type":"ContainerDied","Data":"7d10f37c56f01e9c73c2c7f8a98bc11deb8df08476a7553135e6d9f6c650188e"} Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.768598 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l6t5s" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.815800 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.817423 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.856989 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xfslp"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.883813 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:09:10 crc kubenswrapper[4952]: I1122 03:09:10.964919 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l6t5s"] Nov 22 03:09:11 crc kubenswrapper[4952]: I1122 03:09:11.176763 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-74dm2"] Nov 22 03:09:11 crc kubenswrapper[4952]: I1122 03:09:11.820948 4952 generic.go:334] "Generic (PLEG): container finished" podID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerID="1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58" exitCode=0 Nov 22 03:09:11 crc kubenswrapper[4952]: I1122 03:09:11.821065 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" event={"ID":"f52b66b6-7362-448b-a597-0c71381a4fa0","Type":"ContainerDied","Data":"1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58"} Nov 22 03:09:11 crc kubenswrapper[4952]: I1122 03:09:11.827487 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" event={"ID":"120e59b6-3c22-4ae8-874c-127e794a328f","Type":"ContainerDied","Data":"a5279e6f6dc5cad2251b83c28e1ec51ad5bb346462a8a1d22336edf8a058d869"} Nov 22 03:09:11 crc kubenswrapper[4952]: I1122 03:09:11.827423 4952 generic.go:334] "Generic (PLEG): container finished" podID="120e59b6-3c22-4ae8-874c-127e794a328f" containerID="a5279e6f6dc5cad2251b83c28e1ec51ad5bb346462a8a1d22336edf8a058d869" exitCode=0 Nov 22 03:09:11 crc kubenswrapper[4952]: W1122 03:09:11.929156 4952 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2610ac31_191b_4b34_8b6c_2362a88d2e40.slice/crio-1299cfcaa6b07e4473f76aa31ce75325a22e5084223d1c401cdd90a90bb69735 WatchSource:0}: Error finding container 1299cfcaa6b07e4473f76aa31ce75325a22e5084223d1c401cdd90a90bb69735: Status 404 returned error can't find the container with id 1299cfcaa6b07e4473f76aa31ce75325a22e5084223d1c401cdd90a90bb69735 Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.468521 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.542395 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab234df-237b-46e5-b663-dec7e72c4476" path="/var/lib/kubelet/pods/0ab234df-237b-46e5-b663-dec7e72c4476/volumes" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.542903 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2353f8d1-e2bf-488c-9f9c-08147b19b8a3" path="/var/lib/kubelet/pods/2353f8d1-e2bf-488c-9f9c-08147b19b8a3/volumes" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.629458 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc\") pod \"120e59b6-3c22-4ae8-874c-127e794a328f\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.629567 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpwpr\" (UniqueName: \"kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr\") pod \"120e59b6-3c22-4ae8-874c-127e794a328f\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.629650 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config\") pod \"120e59b6-3c22-4ae8-874c-127e794a328f\" (UID: \"120e59b6-3c22-4ae8-874c-127e794a328f\") " Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.635190 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr" (OuterVolumeSpecName: "kube-api-access-rpwpr") pod "120e59b6-3c22-4ae8-874c-127e794a328f" (UID: "120e59b6-3c22-4ae8-874c-127e794a328f"). InnerVolumeSpecName "kube-api-access-rpwpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.653319 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "120e59b6-3c22-4ae8-874c-127e794a328f" (UID: "120e59b6-3c22-4ae8-874c-127e794a328f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.661841 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config" (OuterVolumeSpecName: "config") pod "120e59b6-3c22-4ae8-874c-127e794a328f" (UID: "120e59b6-3c22-4ae8-874c-127e794a328f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.732038 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.732089 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120e59b6-3c22-4ae8-874c-127e794a328f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.732104 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpwpr\" (UniqueName: \"kubernetes.io/projected/120e59b6-3c22-4ae8-874c-127e794a328f-kube-api-access-rpwpr\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.820223 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.842189 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" event={"ID":"120e59b6-3c22-4ae8-874c-127e794a328f","Type":"ContainerDied","Data":"f48ab05ba62aeb9b0139c7465630bafc0fe3567bf7584887e6d68d00d51bd1f3"} Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.842263 4952 scope.go:117] "RemoveContainer" containerID="a5279e6f6dc5cad2251b83c28e1ec51ad5bb346462a8a1d22336edf8a058d869" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.842424 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qtwxh" Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.849421 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-74dm2" event={"ID":"2610ac31-191b-4b34-8b6c-2362a88d2e40","Type":"ContainerStarted","Data":"1299cfcaa6b07e4473f76aa31ce75325a22e5084223d1c401cdd90a90bb69735"} Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.903365 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:09:12 crc kubenswrapper[4952]: I1122 03:09:12.905118 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qtwxh"] Nov 22 03:09:14 crc kubenswrapper[4952]: I1122 03:09:14.551165 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120e59b6-3c22-4ae8-874c-127e794a328f" path="/var/lib/kubelet/pods/120e59b6-3c22-4ae8-874c-127e794a328f/volumes" Nov 22 03:09:14 crc kubenswrapper[4952]: W1122 03:09:14.783623 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805bbb05_c886_4e37_b535_a5115099a741.slice/crio-b947b3b068a64fda73539ab8edcfa7ab7529ed23d8b83693d35d4392ddd8419f WatchSource:0}: Error finding container b947b3b068a64fda73539ab8edcfa7ab7529ed23d8b83693d35d4392ddd8419f: Status 404 returned error can't find the container with id b947b3b068a64fda73539ab8edcfa7ab7529ed23d8b83693d35d4392ddd8419f Nov 22 03:09:14 crc kubenswrapper[4952]: I1122 03:09:14.870418 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" event={"ID":"805bbb05-c886-4e37-b535-a5115099a741","Type":"ContainerStarted","Data":"b947b3b068a64fda73539ab8edcfa7ab7529ed23d8b83693d35d4392ddd8419f"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.003014 4952 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/openstack-galera-0" event={"ID":"9a3eb772-8262-4b28-873f-63f00885054d","Type":"ContainerStarted","Data":"3d73b5d50a826dc12675090c2c6e43c3219665b51cca6f46e6759a427c37739d"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.005921 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da62937c-7359-4ba3-bf95-ddc5545df677","Type":"ContainerStarted","Data":"b1f9b08074685ada77b92860a4777fda478ce2f1a57c4e06989adfb57307c31e"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.007832 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0f098b30-65ee-4b19-9674-c384bddf0832","Type":"ContainerStarted","Data":"96678be836b92540abe3989a15f722bd7059a2c5d67f2bdd10b24b149a64b5f2"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.011601 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" event={"ID":"f52b66b6-7362-448b-a597-0c71381a4fa0","Type":"ContainerStarted","Data":"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.012239 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.013966 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-74dm2" event={"ID":"2610ac31-191b-4b34-8b6c-2362a88d2e40","Type":"ContainerStarted","Data":"1b2f58077e7d084e3bb066ce303bb55dba33f40d41d47198d94361c4ebf6cd55"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.015936 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9kcvz" event={"ID":"82bf89be-221b-4963-bfa4-794e0eb978c6","Type":"ContainerStarted","Data":"ff4aced1d24b8a0a332e3d9dbd9877ea64e60340527632f4dc95c34ff59dae34"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.016457 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9kcvz" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.019220 4952 generic.go:334] "Generic (PLEG): container finished" podID="805bbb05-c886-4e37-b535-a5115099a741" containerID="f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4" exitCode=0 Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.019267 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" event={"ID":"805bbb05-c886-4e37-b535-a5115099a741","Type":"ContainerDied","Data":"f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.023036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11ba6f7e-b307-4bd0-9f84-46bcf5721c38","Type":"ContainerStarted","Data":"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.024171 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.029571 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"169f3305-945d-47b9-8764-6e37ee8863e0","Type":"ContainerStarted","Data":"ea462fb7a75ca9eb5b09635605029fbe2d94e3a9c83f6232b93bf016f7d9d6be"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.029713 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/memcached-0" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.039909 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbd8adf1-9949-4500-83d9-dbcb3d42037f","Type":"ContainerStarted","Data":"0f92b66b5a9a6c40e03e73d031376488b82c946fc621466bbcef326e5465f469"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.039983 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbd8adf1-9949-4500-83d9-dbcb3d42037f","Type":"ContainerStarted","Data":"e2060f3603b354715b943355eb64c8537d39cde4529bda9b79cc0f035a8a9ec5"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.048782 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blvps" event={"ID":"36a574de-e2a6-4711-82ae-a7ffc34ef5fd","Type":"ContainerStarted","Data":"ac1baa05c38751512a8ab9d435d8430eadf53e1554ff51b555d12f0894a8849d"} Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.072341 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-74dm2" podStartSLOduration=3.194848153 podStartE2EDuration="13.072318107s" podCreationTimestamp="2025-11-22 03:09:10 +0000 UTC" firstStartedPulling="2025-11-22 03:09:11.933430897 +0000 UTC m=+916.239448170" lastFinishedPulling="2025-11-22 03:09:21.810900851 +0000 UTC m=+926.116918124" observedRunningTime="2025-11-22 03:09:23.070095617 +0000 UTC m=+927.376112890" watchObservedRunningTime="2025-11-22 03:09:23.072318107 +0000 UTC m=+927.378335380" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.097031 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9kcvz" podStartSLOduration=6.345693782 podStartE2EDuration="18.096997815s" podCreationTimestamp="2025-11-22 03:09:05 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.999045717 +0000 UTC m=+914.305062990" lastFinishedPulling="2025-11-22 03:09:21.75034975 +0000 UTC m=+926.056367023" observedRunningTime="2025-11-22 03:09:23.091414144 +0000 UTC m=+927.397431427" watchObservedRunningTime="2025-11-22 03:09:23.096997815 +0000 UTC m=+927.403015088" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.258264 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" podStartSLOduration=27.636006233 podStartE2EDuration="28.258232781s" podCreationTimestamp="2025-11-22 03:08:55 +0000 UTC" firstStartedPulling="2025-11-22 03:09:10.007672882 +0000 UTC m=+914.313690155" lastFinishedPulling="2025-11-22 03:09:10.62989943 +0000 UTC m=+914.935916703" observedRunningTime="2025-11-22 03:09:23.207155738 +0000 UTC m=+927.513173031" watchObservedRunningTime="2025-11-22 03:09:23.258232781 +0000 UTC m=+927.564250054" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.293463 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.219923536 podStartE2EDuration="21.293430074s" podCreationTimestamp="2025-11-22 03:09:02 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.795434544 +0000 UTC m=+914.101451807" lastFinishedPulling="2025-11-22 03:09:21.868941072 +0000 UTC m=+926.174958345" observedRunningTime="2025-11-22 03:09:23.272670783 +0000 UTC m=+927.578688056" watchObservedRunningTime="2025-11-22 03:09:23.293430074 +0000 UTC m=+927.599447357" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.374334 4952 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.849505293 podStartE2EDuration="18.374305944s" podCreationTimestamp="2025-11-22 03:09:05 +0000 UTC" firstStartedPulling="2025-11-22 03:09:10.216090274 +0000 UTC m=+914.522107547" lastFinishedPulling="2025-11-22 03:09:21.740890925 +0000 UTC m=+926.046908198" observedRunningTime="2025-11-22 03:09:23.367460839 +0000 UTC m=+927.673478112" watchObservedRunningTime="2025-11-22 03:09:23.374305944 +0000 UTC m=+927.680323217" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.417046 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.152603563 podStartE2EDuration="23.417013411s" podCreationTimestamp="2025-11-22 03:09:00 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.684885261 +0000 UTC m=+913.990902534" lastFinishedPulling="2025-11-22 03:09:20.949295099 +0000 UTC m=+925.255312382" observedRunningTime="2025-11-22 03:09:23.413010643 +0000 UTC m=+927.719027926" watchObservedRunningTime="2025-11-22 03:09:23.417013411 +0000 UTC m=+927.723030684" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.762179 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.815029 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:09:23 crc kubenswrapper[4952]: E1122 03:09:23.815538 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120e59b6-3c22-4ae8-874c-127e794a328f" containerName="init" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.815584 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="120e59b6-3c22-4ae8-874c-127e794a328f" containerName="init" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.815785 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="120e59b6-3c22-4ae8-874c-127e794a328f" containerName="init" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.818324 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.821422 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.840161 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.969694 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.969806 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.969856 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.970073 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vhx\" (UniqueName: \"kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:23 crc kubenswrapper[4952]: I1122 03:09:23.970221 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.059351 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerStarted","Data":"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"} Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.061113 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerStarted","Data":"16647ee3b5f63944a1cff2cb0ee6ea097461a6599cbd4eb83545d89f630e239c"} Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.062994 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" event={"ID":"805bbb05-c886-4e37-b535-a5115099a741","Type":"ContainerStarted","Data":"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944"} Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.063143 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:24 crc 
kubenswrapper[4952]: I1122 03:09:24.065768 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"da62937c-7359-4ba3-bf95-ddc5545df677","Type":"ContainerStarted","Data":"b5b06fe8793b1a37317aaf82143e763b3c682fd4fa824b5d340b66245bae986c"} Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.067288 4952 generic.go:334] "Generic (PLEG): container finished" podID="36a574de-e2a6-4711-82ae-a7ffc34ef5fd" containerID="ac1baa05c38751512a8ab9d435d8430eadf53e1554ff51b555d12f0894a8849d" exitCode=0 Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.067361 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blvps" event={"ID":"36a574de-e2a6-4711-82ae-a7ffc34ef5fd","Type":"ContainerDied","Data":"ac1baa05c38751512a8ab9d435d8430eadf53e1554ff51b555d12f0894a8849d"} Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.071996 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.072106 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vhx\" (UniqueName: \"kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.072178 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.072297 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.072384 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.073301 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.073360 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 
crc kubenswrapper[4952]: I1122 03:09:24.073640 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.073640 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.098025 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vhx\" (UniqueName: \"kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx\") pod \"dnsmasq-dns-86db49b7ff-zrl6p\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.123475 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.988705342 podStartE2EDuration="16.12344837s" podCreationTimestamp="2025-11-22 03:09:08 +0000 UTC" firstStartedPulling="2025-11-22 03:09:10.613630869 +0000 UTC m=+914.919648132" lastFinishedPulling="2025-11-22 03:09:21.748373887 +0000 UTC m=+926.054391160" observedRunningTime="2025-11-22 03:09:24.106228294 +0000 UTC m=+928.412245587" watchObservedRunningTime="2025-11-22 03:09:24.12344837 +0000 UTC m=+928.429465643" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.145934 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.167096 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" podStartSLOduration=14.16706139 podStartE2EDuration="14.16706139s" podCreationTimestamp="2025-11-22 03:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:24.156499455 +0000 UTC m=+928.462516728" watchObservedRunningTime="2025-11-22 03:09:24.16706139 +0000 UTC m=+928.473078663" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.721209 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.801881 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:24 crc kubenswrapper[4952]: I1122 03:09:24.801950 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.079061 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blvps" event={"ID":"36a574de-e2a6-4711-82ae-a7ffc34ef5fd","Type":"ContainerStarted","Data":"baaeea913f4db6a9dc2939715fa84caf517d58838761686939fd620af0a63aa4"} Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.079602 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-blvps" event={"ID":"36a574de-e2a6-4711-82ae-a7ffc34ef5fd","Type":"ContainerStarted","Data":"cf961caf21fd683f3779aa220eaff01d2bb0459920168b3a78644e42a6f06f38"} Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.079622 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.079634 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.080853 4952 generic.go:334] "Generic (PLEG): container finished" podID="d536449d-9892-4188-81e1-62b5c87627b1" containerID="eb86b867aa978309b9fb36a59345c210c87ff6e3adeb197ff12337e93d59e298" exitCode=0 Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.080941 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" event={"ID":"d536449d-9892-4188-81e1-62b5c87627b1","Type":"ContainerDied","Data":"eb86b867aa978309b9fb36a59345c210c87ff6e3adeb197ff12337e93d59e298"} Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.080998 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" event={"ID":"d536449d-9892-4188-81e1-62b5c87627b1","Type":"ContainerStarted","Data":"aa4c263fd382c54581956d24e33cbb1450240c937c1f54ac0a4821b538afed39"} Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.081770 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="dnsmasq-dns" containerID="cri-o://d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58" gracePeriod=10 Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.117232 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-blvps" 
podStartSLOduration=8.80147553 podStartE2EDuration="20.117144687s" podCreationTimestamp="2025-11-22 03:09:05 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.955810596 +0000 UTC m=+914.261827869" lastFinishedPulling="2025-11-22 03:09:21.271479743 +0000 UTC m=+925.577497026" observedRunningTime="2025-11-22 03:09:25.108353519 +0000 UTC m=+929.414370782" watchObservedRunningTime="2025-11-22 03:09:25.117144687 +0000 UTC m=+929.423161970" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.139151 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.222587 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.621222 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.737411 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kgh7\" (UniqueName: \"kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7\") pod \"f52b66b6-7362-448b-a597-0c71381a4fa0\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.737504 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc\") pod \"f52b66b6-7362-448b-a597-0c71381a4fa0\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.737717 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config\") pod \"f52b66b6-7362-448b-a597-0c71381a4fa0\" (UID: \"f52b66b6-7362-448b-a597-0c71381a4fa0\") " Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.746141 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7" (OuterVolumeSpecName: "kube-api-access-6kgh7") pod "f52b66b6-7362-448b-a597-0c71381a4fa0" (UID: "f52b66b6-7362-448b-a597-0c71381a4fa0"). InnerVolumeSpecName "kube-api-access-6kgh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.785329 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f52b66b6-7362-448b-a597-0c71381a4fa0" (UID: "f52b66b6-7362-448b-a597-0c71381a4fa0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.789193 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config" (OuterVolumeSpecName: "config") pod "f52b66b6-7362-448b-a597-0c71381a4fa0" (UID: "f52b66b6-7362-448b-a597-0c71381a4fa0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.839836 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kgh7\" (UniqueName: \"kubernetes.io/projected/f52b66b6-7362-448b-a597-0c71381a4fa0-kube-api-access-6kgh7\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.839912 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:25 crc kubenswrapper[4952]: I1122 03:09:25.839928 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f52b66b6-7362-448b-a597-0c71381a4fa0-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.092223 4952 generic.go:334] "Generic (PLEG): container finished" podID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerID="d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58" exitCode=0 Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.092304 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" event={"ID":"f52b66b6-7362-448b-a597-0c71381a4fa0","Type":"ContainerDied","Data":"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58"} Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.092319 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.092338 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bbm99" event={"ID":"f52b66b6-7362-448b-a597-0c71381a4fa0","Type":"ContainerDied","Data":"7fee7d86b173bfbba094fcf7addaa7b51304bb4cb0dcaf6d196f6bc647035344"} Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.092361 4952 scope.go:117] "RemoveContainer" containerID="d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.095816 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" event={"ID":"d536449d-9892-4188-81e1-62b5c87627b1","Type":"ContainerStarted","Data":"1a24ed5e9e0ca403572dd57ce7f7eeaa09afe8f2fcfd280cb1f7fe26dfbc4c41"} Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.096752 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.096775 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.138250 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" podStartSLOduration=3.138219016 podStartE2EDuration="3.138219016s" podCreationTimestamp="2025-11-22 03:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:26.129184732 +0000 UTC m=+930.435202005" watchObservedRunningTime="2025-11-22 03:09:26.138219016 +0000 UTC m=+930.444236329" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.153814 4952 scope.go:117] "RemoveContainer" containerID="1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 
03:09:26.160746 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.170809 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bbm99"] Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.176115 4952 scope.go:117] "RemoveContainer" containerID="d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58" Nov 22 03:09:26 crc kubenswrapper[4952]: E1122 03:09:26.176895 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58\": container with ID starting with d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58 not found: ID does not exist" containerID="d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.176970 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58"} err="failed to get container status \"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58\": rpc error: code = NotFound desc = could not find container \"d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58\": container with ID starting with d623283dcc093dc4146613978684980919d2dad6f37c6e1e24e8d053c4967c58 not found: ID does not exist" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.177018 4952 scope.go:117] "RemoveContainer" containerID="1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58" Nov 22 03:09:26 crc kubenswrapper[4952]: E1122 03:09:26.177467 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58\": container with ID starting with 1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58 not found: ID does not exist" containerID="1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.177505 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58"} err="failed to get container status \"1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58\": rpc error: code = NotFound desc = could not find container \"1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58\": container with ID starting with 1ad7281454e0bb008490f99c34bb5bb9392fa4705ff3b0543cc66f90224cdb58 not found: ID does not exist" Nov 22 03:09:26 crc kubenswrapper[4952]: I1122 03:09:26.551022 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" path="/var/lib/kubelet/pods/f52b66b6-7362-448b-a597-0c71381a4fa0/volumes" Nov 22 03:09:27 crc kubenswrapper[4952]: I1122 03:09:27.155391 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 22 03:09:27 crc kubenswrapper[4952]: I1122 03:09:27.848374 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:27 crc kubenswrapper[4952]: I1122 03:09:27.889415 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 03:09:28 crc 
kubenswrapper[4952]: I1122 03:09:28.101511 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:09:28 crc kubenswrapper[4952]: E1122 03:09:28.101952 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="dnsmasq-dns" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.101973 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="dnsmasq-dns" Nov 22 03:09:28 crc kubenswrapper[4952]: E1122 03:09:28.102003 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="init" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.102012 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="init" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.102236 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52b66b6-7362-448b-a597-0c71381a4fa0" containerName="dnsmasq-dns" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.103306 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.106506 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.106639 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.106763 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.111431 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xrpg7" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.125712 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.195879 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvzk\" (UniqueName: \"kubernetes.io/projected/fc722b47-abbf-470b-8e75-be1c5208c604-kube-api-access-vdvzk\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.195969 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-scripts\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.196004 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.196023 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.196635 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.196888 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-config\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.197066 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298566 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298640 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-config\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298698 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298734 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvzk\" (UniqueName: \"kubernetes.io/projected/fc722b47-abbf-470b-8e75-be1c5208c604-kube-api-access-vdvzk\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298768 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-scripts\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.298790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.299695 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-config\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.299722 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.300062 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.300190 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc722b47-abbf-470b-8e75-be1c5208c604-scripts\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.308300 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.308427 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.314305 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc722b47-abbf-470b-8e75-be1c5208c604-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.318435 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvzk\" (UniqueName: \"kubernetes.io/projected/fc722b47-abbf-470b-8e75-be1c5208c604-kube-api-access-vdvzk\") pod \"ovn-northd-0\" (UID: \"fc722b47-abbf-470b-8e75-be1c5208c604\") " pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.349246 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.349706 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.423035 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 03:09:28 crc kubenswrapper[4952]: I1122 03:09:28.915856 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:09:29 crc kubenswrapper[4952]: I1122 03:09:29.144229 4952 generic.go:334] "Generic (PLEG): container finished" podID="9a3eb772-8262-4b28-873f-63f00885054d" containerID="3d73b5d50a826dc12675090c2c6e43c3219665b51cca6f46e6759a427c37739d" exitCode=0 Nov 22 03:09:29 crc kubenswrapper[4952]: I1122 03:09:29.144516 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9a3eb772-8262-4b28-873f-63f00885054d","Type":"ContainerDied","Data":"3d73b5d50a826dc12675090c2c6e43c3219665b51cca6f46e6759a427c37739d"} Nov 22 03:09:29 crc kubenswrapper[4952]: I1122 03:09:29.148607 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc722b47-abbf-470b-8e75-be1c5208c604","Type":"ContainerStarted","Data":"ea7ec7531f0f898685d7b399d52ffa8ae4515b6fa5ae95819bfd630db04c6386"} Nov 22 03:09:30 crc kubenswrapper[4952]: I1122 03:09:30.554456 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 22 03:09:30 crc kubenswrapper[4952]: I1122 03:09:30.818798 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:31 crc kubenswrapper[4952]: I1122 03:09:31.167118 4952 generic.go:334] "Generic (PLEG): container finished" podID="0f098b30-65ee-4b19-9674-c384bddf0832" containerID="96678be836b92540abe3989a15f722bd7059a2c5d67f2bdd10b24b149a64b5f2" exitCode=0 Nov 22 03:09:31 crc kubenswrapper[4952]: I1122 03:09:31.167188 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0f098b30-65ee-4b19-9674-c384bddf0832","Type":"ContainerDied","Data":"96678be836b92540abe3989a15f722bd7059a2c5d67f2bdd10b24b149a64b5f2"} Nov 22 03:09:31 crc kubenswrapper[4952]: I1122 03:09:31.174529 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9a3eb772-8262-4b28-873f-63f00885054d","Type":"ContainerStarted","Data":"c356ebe9eb26bdf5cf0230b0be8ad382b94249b64fe0b2c33417a4adabc29233"} Nov 22 03:09:31 crc kubenswrapper[4952]: I1122 03:09:31.239626 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.157056669 podStartE2EDuration="34.239599892s" podCreationTimestamp="2025-11-22 03:08:57 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.665842685 +0000 UTC m=+913.971859958" lastFinishedPulling="2025-11-22 03:09:21.748385908 +0000 UTC m=+926.054403181" observedRunningTime="2025-11-22 03:09:31.234954906 +0000 UTC m=+935.540972169" watchObservedRunningTime="2025-11-22 03:09:31.239599892 +0000 UTC m=+935.545617165" Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.188934 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc722b47-abbf-470b-8e75-be1c5208c604","Type":"ContainerStarted","Data":"bbd1e21a5bf91791d934ec106b7cc793e051a522e093be190ad92b06ee2cdbd1"} Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.189570 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc722b47-abbf-470b-8e75-be1c5208c604","Type":"ContainerStarted","Data":"0a573e9b19ff8f70affb0762bd74f593c62993d9723b9d4a2cc849d758162282"} Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.191263 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.196863 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0f098b30-65ee-4b19-9674-c384bddf0832","Type":"ContainerStarted","Data":"9081361569afbcac66b3add8c5969404df8c24dfdd69a20032ac99bc4a2a4717"} Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.212503 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8619815480000002 podStartE2EDuration="4.212476755s" podCreationTimestamp="2025-11-22 03:09:28 +0000 UTC" firstStartedPulling="2025-11-22 03:09:28.92586124 +0000 UTC m=+933.231878513" lastFinishedPulling="2025-11-22 03:09:31.276356447 +0000 UTC m=+935.582373720" observedRunningTime="2025-11-22 03:09:32.207314446 +0000 UTC m=+936.513331729" watchObservedRunningTime="2025-11-22 03:09:32.212476755 +0000 UTC m=+936.518494028" Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.240486 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.594526473 podStartE2EDuration="34.240466193s" podCreationTimestamp="2025-11-22 03:08:58 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.686479994 +0000 UTC m=+913.992497267" lastFinishedPulling="2025-11-22 03:09:21.332419674 +0000 UTC m=+925.638436987" observedRunningTime="2025-11-22 03:09:32.233612368 +0000 UTC m=+936.539629641" watchObservedRunningTime="2025-11-22 03:09:32.240466193 +0000 UTC m=+936.546483466" Nov 22 03:09:32 crc kubenswrapper[4952]: I1122 03:09:32.506300 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.147761 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.271725 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.272072 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="dnsmasq-dns" containerID="cri-o://f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944" gracePeriod=10 Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.832108 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.882460 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb\") pod \"805bbb05-c886-4e37-b535-a5115099a741\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.882824 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config\") pod \"805bbb05-c886-4e37-b535-a5115099a741\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.883033 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwrd\" (UniqueName: \"kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd\") pod \"805bbb05-c886-4e37-b535-a5115099a741\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.883156 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc\") pod \"805bbb05-c886-4e37-b535-a5115099a741\" (UID: \"805bbb05-c886-4e37-b535-a5115099a741\") " Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.892328 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd" (OuterVolumeSpecName: "kube-api-access-fzwrd") pod "805bbb05-c886-4e37-b535-a5115099a741" (UID: "805bbb05-c886-4e37-b535-a5115099a741"). InnerVolumeSpecName "kube-api-access-fzwrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.935131 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config" (OuterVolumeSpecName: "config") pod "805bbb05-c886-4e37-b535-a5115099a741" (UID: "805bbb05-c886-4e37-b535-a5115099a741"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.946971 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "805bbb05-c886-4e37-b535-a5115099a741" (UID: "805bbb05-c886-4e37-b535-a5115099a741"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.953972 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "805bbb05-c886-4e37-b535-a5115099a741" (UID: "805bbb05-c886-4e37-b535-a5115099a741"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.985600 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.985644 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzwrd\" (UniqueName: \"kubernetes.io/projected/805bbb05-c886-4e37-b535-a5115099a741-kube-api-access-fzwrd\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.985659 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:34 crc kubenswrapper[4952]: I1122 03:09:34.985669 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/805bbb05-c886-4e37-b535-a5115099a741-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.251885 4952 generic.go:334] "Generic (PLEG): container finished" podID="805bbb05-c886-4e37-b535-a5115099a741" containerID="f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944" exitCode=0 Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.251977 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" event={"ID":"805bbb05-c886-4e37-b535-a5115099a741","Type":"ContainerDied","Data":"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944"} Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.252019 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.252022 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jgf6r" event={"ID":"805bbb05-c886-4e37-b535-a5115099a741","Type":"ContainerDied","Data":"b947b3b068a64fda73539ab8edcfa7ab7529ed23d8b83693d35d4392ddd8419f"} Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.252040 4952 scope.go:117] "RemoveContainer" containerID="f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.279429 4952 scope.go:117] "RemoveContainer" containerID="f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.299051 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.304691 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jgf6r"] Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.317501 4952 scope.go:117] "RemoveContainer" containerID="f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944" Nov 22 03:09:35 crc kubenswrapper[4952]: E1122 03:09:35.318372 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944\": container with ID starting with f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944 not found: ID does not exist" containerID="f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.318450 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944"} err="failed to get container status \"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944\": rpc error: code = NotFound desc = could not find container \"f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944\": container with ID starting with f4749f594647d060deeeda14cd8c826b67d37b3cabe2619a9a146cf82c6a0944 not found: ID does not exist" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.318495 4952 scope.go:117] "RemoveContainer" containerID="f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4" Nov 22 03:09:35 crc kubenswrapper[4952]: E1122 03:09:35.319162 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4\": container with ID starting with f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4 not found: ID does not exist" containerID="f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4" Nov 22 03:09:35 crc kubenswrapper[4952]: I1122 03:09:35.319239 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4"} err="failed to get container status \"f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4\": rpc error: code = NotFound desc = could not find container \"f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4\": container with ID starting with f993bda9c117bcce0d409b2f6965cef5cdcdfdcfd454e40b186f64656c64a6f4 not found: ID does not exist" Nov 22 
03:09:36 crc kubenswrapper[4952]: I1122 03:09:36.543216 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805bbb05-c886-4e37-b535-a5115099a741" path="/var/lib/kubelet/pods/805bbb05-c886-4e37-b535-a5115099a741/volumes" Nov 22 03:09:38 crc kubenswrapper[4952]: I1122 03:09:38.862083 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 03:09:38 crc kubenswrapper[4952]: I1122 03:09:38.862497 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 03:09:40 crc kubenswrapper[4952]: I1122 03:09:40.299597 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:40 crc kubenswrapper[4952]: I1122 03:09:40.300500 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:41 crc kubenswrapper[4952]: I1122 03:09:41.197126 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:41 crc kubenswrapper[4952]: I1122 03:09:41.428092 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="0f098b30-65ee-4b19-9674-c384bddf0832" containerName="galera" probeResult="failure" output=< Nov 22 03:09:41 crc kubenswrapper[4952]: wsrep_local_state_comment (Joined) differs from Synced Nov 22 03:09:41 crc kubenswrapper[4952]: > Nov 22 03:09:42 crc kubenswrapper[4952]: I1122 03:09:42.417421 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 22 03:09:42 crc kubenswrapper[4952]: I1122 03:09:42.713072 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 03:09:42 crc kubenswrapper[4952]: I1122 03:09:42.791776 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 22 03:09:43 crc kubenswrapper[4952]: I1122 03:09:43.515009 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.154656 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5e82-account-create-c5txw"] Nov 22 03:09:50 crc kubenswrapper[4952]: E1122 03:09:50.155908 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="init" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.155929 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="init" Nov 22 03:09:50 crc kubenswrapper[4952]: E1122 03:09:50.155988 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="dnsmasq-dns" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.156004 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="dnsmasq-dns" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.156297 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="805bbb05-c886-4e37-b535-a5115099a741" containerName="dnsmasq-dns" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.157386 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.160683 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.183719 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e82-account-create-c5txw"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.208059 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8gz\" (UniqueName: \"kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.208174 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.213834 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bp7sw"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.218361 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.241149 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bp7sw"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.311719 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8gz\" (UniqueName: \"kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.311798 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.311896 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.311947 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrq8k\" (UniqueName: \"kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.312952 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.335769 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8gz\" (UniqueName: \"kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz\") pod \"keystone-5e82-account-create-c5txw\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.399506 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-b5bx2"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.401099 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.413708 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b5bx2"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.414505 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.414608 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrq8k\" (UniqueName: \"kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.416125 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.436157 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrq8k\" (UniqueName: \"kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k\") pod \"keystone-db-create-bp7sw\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.502640 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.506534 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-dc03-account-create-p47mp"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.508047 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.509982 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.516112 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.516388 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886g5\" (UniqueName: \"kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.525360 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dc03-account-create-p47mp"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.577623 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.618667 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.618733 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.618781 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-886g5\" (UniqueName: \"kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.618827 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92m6c\" (UniqueName: \"kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.620405 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.640480 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-886g5\" (UniqueName: \"kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5\") pod \"placement-db-create-b5bx2\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.722221 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.722438 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92m6c\" (UniqueName: \"kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.723742 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.726975 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.756586 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92m6c\" (UniqueName: \"kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c\") pod \"placement-dc03-account-create-p47mp\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.763668 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jhbw8"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.765301 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.771426 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhbw8"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.824044 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.824115 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnk9\" (UniqueName: \"kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.827298 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.879670 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bbee-account-create-zrsv8"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.881021 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.886196 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.888823 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bbee-account-create-zrsv8"] Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.929174 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.929279 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.929328 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnk9\" (UniqueName: \"kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.929367 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxg92\" (UniqueName: \"kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.930275 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.948503 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnk9\" (UniqueName: \"kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9\") pod \"glance-db-create-jhbw8\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:50 crc kubenswrapper[4952]: I1122 03:09:50.982917 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e82-account-create-c5txw"] Nov 22 03:09:50 crc kubenswrapper[4952]: W1122 03:09:50.991181 4952 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd474404_fc22_4fec_8b02_9c536b93d36e.slice/crio-cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849 WatchSource:0}: Error finding container cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849: Status 404 returned error can't find the container with id cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849 Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.033143 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.033333 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxg92\" (UniqueName: \"kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.034608 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.051849 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxg92\" (UniqueName: \"kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92\") pod \"glance-bbee-account-create-zrsv8\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.086619 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bp7sw"] Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.098791 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:51 crc kubenswrapper[4952]: W1122 03:09:51.111506 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975b18ab_9e5c_4640_be72_1d2e76ebedda.slice/crio-d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c WatchSource:0}: Error finding container d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c: Status 404 returned error can't find the container with id d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.197860 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b5bx2"] Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.205099 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.316148 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dc03-account-create-p47mp"] Nov 22 03:09:51 crc kubenswrapper[4952]: W1122 03:09:51.331930 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ed0223_bfb6_490c_b39d_3f57968b5744.slice/crio-c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b WatchSource:0}: Error finding container c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b: Status 404 returned error can't find the container with id c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.419820 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e82-account-create-c5txw" event={"ID":"fd474404-fc22-4fec-8b02-9c536b93d36e","Type":"ContainerStarted","Data":"99ae91e2f66a6c6e4655b1db7391481fdd2d51642ca0992f30d2b8bb833bef3c"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.420328 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e82-account-create-c5txw" event={"ID":"fd474404-fc22-4fec-8b02-9c536b93d36e","Type":"ContainerStarted","Data":"cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.422443 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dc03-account-create-p47mp" event={"ID":"03ed0223-bfb6-490c-b39d-3f57968b5744","Type":"ContainerStarted","Data":"c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.423345 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp7sw" event={"ID":"975b18ab-9e5c-4640-be72-1d2e76ebedda","Type":"ContainerStarted","Data":"15ab4dd1f69cacb3c62533c10cbf7bcfe0163cc87e2a11360cdf975863cfa9c1"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.423367 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp7sw" event={"ID":"975b18ab-9e5c-4640-be72-1d2e76ebedda","Type":"ContainerStarted","Data":"d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.425329 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5bx2" event={"ID":"3187f587-bdb9-4e8e-a009-e08c4d420041","Type":"ContainerStarted","Data":"5aeb72d6c5f5b673f638ff697ec5e31f5d26820e0f847604d1d06c6776ff5675"} Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.462392 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bp7sw" podStartSLOduration=1.462364568 podStartE2EDuration="1.462364568s" podCreationTimestamp="2025-11-22 03:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:51.456449438 +0000 UTC m=+955.762466711" watchObservedRunningTime="2025-11-22 03:09:51.462364568 +0000 UTC m=+955.768381841" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.462782 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5e82-account-create-c5txw" podStartSLOduration=1.462774929 podStartE2EDuration="1.462774929s" podCreationTimestamp="2025-11-22 03:09:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:51.441499943 +0000 UTC m=+955.747517216" watchObservedRunningTime="2025-11-22 03:09:51.462774929 +0000 UTC m=+955.768792202" Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.616734 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhbw8"] Nov 22 03:09:51 crc kubenswrapper[4952]: I1122 03:09:51.737920 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bbee-account-create-zrsv8"] Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.472339 4952 generic.go:334] "Generic (PLEG): container finished" podID="a1ae67f4-66a0-4a73-80f2-eacbd1db165f" containerID="bcf338e4137c3abe5a2700ddbd7809bd5d4eaf979ccda9323d21b8793e269db0" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.473611 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhbw8" event={"ID":"a1ae67f4-66a0-4a73-80f2-eacbd1db165f","Type":"ContainerDied","Data":"bcf338e4137c3abe5a2700ddbd7809bd5d4eaf979ccda9323d21b8793e269db0"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.473658 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhbw8" event={"ID":"a1ae67f4-66a0-4a73-80f2-eacbd1db165f","Type":"ContainerStarted","Data":"970654f395e983d6658b2e2289cce79bb80d8d380234da48c55fdb2e060152b5"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.489606 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dc03-account-create-p47mp" event={"ID":"03ed0223-bfb6-490c-b39d-3f57968b5744","Type":"ContainerDied","Data":"5c530a60887afa6c89f2bd60c9848ebc583dbbd794067c75f3cfc0f2b1f119f2"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.489510 4952 generic.go:334] "Generic (PLEG): container finished" podID="03ed0223-bfb6-490c-b39d-3f57968b5744" containerID="5c530a60887afa6c89f2bd60c9848ebc583dbbd794067c75f3cfc0f2b1f119f2" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.497608 4952 generic.go:334] "Generic (PLEG): container finished" podID="71f4d260-3157-440f-860e-f260bb4c6052" containerID="840c535c7277fd81b578cda6c382ef8911bde294f5c163063bd4151e65960143" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.497953 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bbee-account-create-zrsv8" event={"ID":"71f4d260-3157-440f-860e-f260bb4c6052","Type":"ContainerDied","Data":"840c535c7277fd81b578cda6c382ef8911bde294f5c163063bd4151e65960143"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.498036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bbee-account-create-zrsv8" event={"ID":"71f4d260-3157-440f-860e-f260bb4c6052","Type":"ContainerStarted","Data":"5a033841129034de1469f9f7b25b2d37f948289dba0737d369341cfdc589ee1a"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.499877 4952 generic.go:334] "Generic (PLEG): container finished" podID="975b18ab-9e5c-4640-be72-1d2e76ebedda" containerID="15ab4dd1f69cacb3c62533c10cbf7bcfe0163cc87e2a11360cdf975863cfa9c1" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.499952 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp7sw" event={"ID":"975b18ab-9e5c-4640-be72-1d2e76ebedda","Type":"ContainerDied","Data":"15ab4dd1f69cacb3c62533c10cbf7bcfe0163cc87e2a11360cdf975863cfa9c1"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 
03:09:52.503851 4952 generic.go:334] "Generic (PLEG): container finished" podID="3187f587-bdb9-4e8e-a009-e08c4d420041" containerID="57e8a748cf20c980f359b1c9a53bdeec79a9eee4f038d679f0e2354e0a4ecd25" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.504017 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5bx2" event={"ID":"3187f587-bdb9-4e8e-a009-e08c4d420041","Type":"ContainerDied","Data":"57e8a748cf20c980f359b1c9a53bdeec79a9eee4f038d679f0e2354e0a4ecd25"} Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.509254 4952 generic.go:334] "Generic (PLEG): container finished" podID="fd474404-fc22-4fec-8b02-9c536b93d36e" containerID="99ae91e2f66a6c6e4655b1db7391481fdd2d51642ca0992f30d2b8bb833bef3c" exitCode=0 Nov 22 03:09:52 crc kubenswrapper[4952]: I1122 03:09:52.509312 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e82-account-create-c5txw" event={"ID":"fd474404-fc22-4fec-8b02-9c536b93d36e","Type":"ContainerDied","Data":"99ae91e2f66a6c6e4655b1db7391481fdd2d51642ca0992f30d2b8bb833bef3c"} Nov 22 03:09:53 crc kubenswrapper[4952]: I1122 03:09:53.984620 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.120426 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts\") pod \"3187f587-bdb9-4e8e-a009-e08c4d420041\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.120757 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-886g5\" (UniqueName: \"kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5\") pod \"3187f587-bdb9-4e8e-a009-e08c4d420041\" (UID: \"3187f587-bdb9-4e8e-a009-e08c4d420041\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.121907 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3187f587-bdb9-4e8e-a009-e08c4d420041" (UID: "3187f587-bdb9-4e8e-a009-e08c4d420041"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.146825 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5" (OuterVolumeSpecName: "kube-api-access-886g5") pod "3187f587-bdb9-4e8e-a009-e08c4d420041" (UID: "3187f587-bdb9-4e8e-a009-e08c4d420041"). InnerVolumeSpecName "kube-api-access-886g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.222989 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-886g5\" (UniqueName: \"kubernetes.io/projected/3187f587-bdb9-4e8e-a009-e08c4d420041-kube-api-access-886g5\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.223040 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3187f587-bdb9-4e8e-a009-e08c4d420041-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.267164 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.273144 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.280687 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.292633 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.302811 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.425989 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffnk9\" (UniqueName: \"kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9\") pod \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426040 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrq8k\" (UniqueName: \"kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k\") pod \"975b18ab-9e5c-4640-be72-1d2e76ebedda\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426098 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts\") pod \"975b18ab-9e5c-4640-be72-1d2e76ebedda\" (UID: \"975b18ab-9e5c-4640-be72-1d2e76ebedda\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426186 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92m6c\" (UniqueName: \"kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c\") pod \"03ed0223-bfb6-490c-b39d-3f57968b5744\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426204 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxg92\" (UniqueName: \"kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92\") pod \"71f4d260-3157-440f-860e-f260bb4c6052\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426225 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-db8gz\" (UniqueName: \"kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz\") pod \"fd474404-fc22-4fec-8b02-9c536b93d36e\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426305 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts\") pod \"fd474404-fc22-4fec-8b02-9c536b93d36e\" (UID: \"fd474404-fc22-4fec-8b02-9c536b93d36e\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426368 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts\") pod \"03ed0223-bfb6-490c-b39d-3f57968b5744\" (UID: \"03ed0223-bfb6-490c-b39d-3f57968b5744\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426431 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts\") pod \"71f4d260-3157-440f-860e-f260bb4c6052\" (UID: \"71f4d260-3157-440f-860e-f260bb4c6052\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426464 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts\") pod \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\" (UID: \"a1ae67f4-66a0-4a73-80f2-eacbd1db165f\") " Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426882 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03ed0223-bfb6-490c-b39d-3f57968b5744" (UID: "03ed0223-bfb6-490c-b39d-3f57968b5744"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.426909 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd474404-fc22-4fec-8b02-9c536b93d36e" (UID: "fd474404-fc22-4fec-8b02-9c536b93d36e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.427088 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71f4d260-3157-440f-860e-f260bb4c6052" (UID: "71f4d260-3157-440f-860e-f260bb4c6052"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.427162 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1ae67f4-66a0-4a73-80f2-eacbd1db165f" (UID: "a1ae67f4-66a0-4a73-80f2-eacbd1db165f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.427183 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "975b18ab-9e5c-4640-be72-1d2e76ebedda" (UID: "975b18ab-9e5c-4640-be72-1d2e76ebedda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.430061 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz" (OuterVolumeSpecName: "kube-api-access-db8gz") pod "fd474404-fc22-4fec-8b02-9c536b93d36e" (UID: "fd474404-fc22-4fec-8b02-9c536b93d36e"). InnerVolumeSpecName "kube-api-access-db8gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.430145 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k" (OuterVolumeSpecName: "kube-api-access-hrq8k") pod "975b18ab-9e5c-4640-be72-1d2e76ebedda" (UID: "975b18ab-9e5c-4640-be72-1d2e76ebedda"). InnerVolumeSpecName "kube-api-access-hrq8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.430284 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c" (OuterVolumeSpecName: "kube-api-access-92m6c") pod "03ed0223-bfb6-490c-b39d-3f57968b5744" (UID: "03ed0223-bfb6-490c-b39d-3f57968b5744"). InnerVolumeSpecName "kube-api-access-92m6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.430401 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9" (OuterVolumeSpecName: "kube-api-access-ffnk9") pod "a1ae67f4-66a0-4a73-80f2-eacbd1db165f" (UID: "a1ae67f4-66a0-4a73-80f2-eacbd1db165f"). InnerVolumeSpecName "kube-api-access-ffnk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.432243 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92" (OuterVolumeSpecName: "kube-api-access-fxg92") pod "71f4d260-3157-440f-860e-f260bb4c6052" (UID: "71f4d260-3157-440f-860e-f260bb4c6052"). InnerVolumeSpecName "kube-api-access-fxg92". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.533326 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bp7sw" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.533959 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd474404-fc22-4fec-8b02-9c536b93d36e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534691 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed0223-bfb6-490c-b39d-3f57968b5744-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534724 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71f4d260-3157-440f-860e-f260bb4c6052-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534741 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534756 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffnk9\" (UniqueName: \"kubernetes.io/projected/a1ae67f4-66a0-4a73-80f2-eacbd1db165f-kube-api-access-ffnk9\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534777 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrq8k\" (UniqueName: \"kubernetes.io/projected/975b18ab-9e5c-4640-be72-1d2e76ebedda-kube-api-access-hrq8k\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534791 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975b18ab-9e5c-4640-be72-1d2e76ebedda-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534804 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92m6c\" (UniqueName: \"kubernetes.io/projected/03ed0223-bfb6-490c-b39d-3f57968b5744-kube-api-access-92m6c\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534823 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxg92\" (UniqueName: \"kubernetes.io/projected/71f4d260-3157-440f-860e-f260bb4c6052-kube-api-access-fxg92\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.534835 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db8gz\" (UniqueName: \"kubernetes.io/projected/fd474404-fc22-4fec-8b02-9c536b93d36e-kube-api-access-db8gz\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.535883 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5bx2" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.548620 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e82-account-create-c5txw" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.553406 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhbw8" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.556792 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dc03-account-create-p47mp" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.561678 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bbee-account-create-zrsv8" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.569757 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp7sw" event={"ID":"975b18ab-9e5c-4640-be72-1d2e76ebedda","Type":"ContainerDied","Data":"d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570596 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92ff1c4184d57691ff4ebc8ace182ca7b05b978c27afef355723e3d7db3a42c" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570660 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5bx2" event={"ID":"3187f587-bdb9-4e8e-a009-e08c4d420041","Type":"ContainerDied","Data":"5aeb72d6c5f5b673f638ff697ec5e31f5d26820e0f847604d1d06c6776ff5675"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570745 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aeb72d6c5f5b673f638ff697ec5e31f5d26820e0f847604d1d06c6776ff5675" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570761 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e82-account-create-c5txw" event={"ID":"fd474404-fc22-4fec-8b02-9c536b93d36e","Type":"ContainerDied","Data":"cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570779 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf967ebd1c0e3cfd4d8890bc7b672382ddc5b3705263615b6f6488c338539849" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570824 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhbw8" event={"ID":"a1ae67f4-66a0-4a73-80f2-eacbd1db165f","Type":"ContainerDied","Data":"970654f395e983d6658b2e2289cce79bb80d8d380234da48c55fdb2e060152b5"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570844 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970654f395e983d6658b2e2289cce79bb80d8d380234da48c55fdb2e060152b5" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570859 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dc03-account-create-p47mp" event={"ID":"03ed0223-bfb6-490c-b39d-3f57968b5744","Type":"ContainerDied","Data":"c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570870 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c129cd157f45c6ee3196c90f14ed0dbcfa67c64b74edd6a57089c3430788641b" Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570907 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bbee-account-create-zrsv8" event={"ID":"71f4d260-3157-440f-860e-f260bb4c6052","Type":"ContainerDied","Data":"5a033841129034de1469f9f7b25b2d37f948289dba0737d369341cfdc589ee1a"} Nov 22 03:09:54 crc kubenswrapper[4952]: I1122 03:09:54.570924 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a033841129034de1469f9f7b25b2d37f948289dba0737d369341cfdc589ee1a" Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.573264 4952 generic.go:334] "Generic 
(PLEG): container finished" podID="16617513-df98-4123-b612-9bc83023f977" containerID="1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff" exitCode=0 Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.573361 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerDied","Data":"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"} Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.680644 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9kcvz" podUID="82bf89be-221b-4963-bfa4-794e0eb978c6" containerName="ovn-controller" probeResult="failure" output=< Nov 22 03:09:55 crc kubenswrapper[4952]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 03:09:55 crc kubenswrapper[4952]: > Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.758769 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.765255 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-blvps" Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.998510 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9kcvz-config-9j4dk"] Nov 22 03:09:55 crc kubenswrapper[4952]: E1122 03:09:55.999368 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ae67f4-66a0-4a73-80f2-eacbd1db165f" containerName="mariadb-database-create" Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.999450 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ae67f4-66a0-4a73-80f2-eacbd1db165f" containerName="mariadb-database-create" Nov 22 03:09:55 crc kubenswrapper[4952]: E1122 03:09:55.999524 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f4d260-3157-440f-860e-f260bb4c6052" containerName="mariadb-account-create" Nov 22 03:09:55 crc kubenswrapper[4952]: I1122 03:09:55.999604 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f4d260-3157-440f-860e-f260bb4c6052" containerName="mariadb-account-create" Nov 22 03:09:55 crc kubenswrapper[4952]: E1122 03:09:55.999667 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd474404-fc22-4fec-8b02-9c536b93d36e" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.000708 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd474404-fc22-4fec-8b02-9c536b93d36e" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: E1122 03:09:56.000846 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ed0223-bfb6-490c-b39d-3f57968b5744" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.001020 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ed0223-bfb6-490c-b39d-3f57968b5744" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: E1122 03:09:56.001895 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975b18ab-9e5c-4640-be72-1d2e76ebedda" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.001999 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="975b18ab-9e5c-4640-be72-1d2e76ebedda" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: E1122 03:09:56.002087 
4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3187f587-bdb9-4e8e-a009-e08c4d420041" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.002173 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="3187f587-bdb9-4e8e-a009-e08c4d420041" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.002723 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd474404-fc22-4fec-8b02-9c536b93d36e" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.002823 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f4d260-3157-440f-860e-f260bb4c6052" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.002888 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ae67f4-66a0-4a73-80f2-eacbd1db165f" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.002948 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ed0223-bfb6-490c-b39d-3f57968b5744" containerName="mariadb-account-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.003013 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="3187f587-bdb9-4e8e-a009-e08c4d420041" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.003082 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="975b18ab-9e5c-4640-be72-1d2e76ebedda" containerName="mariadb-database-create" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.004054 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.006671 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.015222 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9kcvz-config-9j4dk"] Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072343 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072514 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072592 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072626 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jcz6r\" (UniqueName: \"kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072683 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.072722 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.096458 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hcq9n"] Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.097947 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.100464 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.100961 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mtlf4" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.113335 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hcq9n"] Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.173917 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.173974 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcz6r\" (UniqueName: \"kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174026 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174066 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 
03:09:56.174090 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174120 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174144 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72qt\" (UniqueName: \"kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174189 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174214 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174242 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174850 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174856 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.174929 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.175064 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.195505 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcz6r\" (UniqueName: \"kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.210169 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts\") pod \"ovn-controller-9kcvz-config-9j4dk\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.275923 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.276503 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r72qt\" (UniqueName: \"kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.276590 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.276628 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.281322 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.282907 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.297294 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.305767 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72qt\" (UniqueName: \"kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt\") pod \"glance-db-sync-hcq9n\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") " pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.322457 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.415057 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hcq9n" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.605012 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerStarted","Data":"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"} Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.606403 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.609671 4952 generic.go:334] "Generic (PLEG): container finished" podID="74351431-f23e-45c3-a8a5-08143737551a" containerID="16647ee3b5f63944a1cff2cb0ee6ea097461a6599cbd4eb83545d89f630e239c" exitCode=0 Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.609735 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerDied","Data":"16647ee3b5f63944a1cff2cb0ee6ea097461a6599cbd4eb83545d89f630e239c"} Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.639784 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.532842248 podStartE2EDuration="1m1.639761963s" podCreationTimestamp="2025-11-22 03:08:55 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.841973094 +0000 UTC m=+914.147990367" lastFinishedPulling="2025-11-22 03:09:20.948892799 +0000 UTC m=+925.254910082" observedRunningTime="2025-11-22 03:09:56.628290082 +0000 UTC m=+960.934307365" watchObservedRunningTime="2025-11-22 03:09:56.639761963 +0000 UTC m=+960.945779236" Nov 22 03:09:56 crc kubenswrapper[4952]: I1122 03:09:56.791488 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9kcvz-config-9j4dk"] Nov 22 03:09:56 crc kubenswrapper[4952]: W1122 03:09:56.793315 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda68a727e_a1df_4627_8b4f_78e8293b58c6.slice/crio-504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0 WatchSource:0}: Error finding container 504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0: Status 404 returned error can't find the container with id 504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0 Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.106986 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hcq9n"] Nov 22 03:09:57 crc kubenswrapper[4952]: W1122 03:09:57.109616 4952 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dceab59_b76c_4c04_b00f_d81a39fd90ab.slice/crio-5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d WatchSource:0}: Error finding container 5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d: Status 404 returned error can't find the container with id 5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.623246 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerStarted","Data":"a534a8defa6a1441dddd764b128e65631990ebc6fd61c14f9597fc18ea6815df"} Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.624084 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.628036 4952 generic.go:334] "Generic (PLEG): container finished" podID="a68a727e-a1df-4627-8b4f-78e8293b58c6" containerID="80f01f51b674de019efd2b62df4203846c258323d85fe6d06ca876d3111deee3" exitCode=0 Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.628210 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9kcvz-config-9j4dk" event={"ID":"a68a727e-a1df-4627-8b4f-78e8293b58c6","Type":"ContainerDied","Data":"80f01f51b674de019efd2b62df4203846c258323d85fe6d06ca876d3111deee3"} Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.628244 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9kcvz-config-9j4dk" event={"ID":"a68a727e-a1df-4627-8b4f-78e8293b58c6","Type":"ContainerStarted","Data":"504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0"} Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.631303 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hcq9n" event={"ID":"7dceab59-b76c-4c04-b00f-d81a39fd90ab","Type":"ContainerStarted","Data":"5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d"} Nov 22 03:09:57 crc kubenswrapper[4952]: I1122 03:09:57.655720 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.306275222 podStartE2EDuration="1m2.655694172s" podCreationTimestamp="2025-11-22 03:08:55 +0000 UTC" firstStartedPulling="2025-11-22 03:09:09.391462205 +0000 UTC m=+913.697479468" lastFinishedPulling="2025-11-22 03:09:21.740881135 +0000 UTC m=+926.046898418" observedRunningTime="2025-11-22 03:09:57.645376643 +0000 UTC m=+961.951393926" watchObservedRunningTime="2025-11-22 03:09:57.655694172 +0000 UTC m=+961.961711455" Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.341605 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.341990 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.342103 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.342959 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.343084 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19" gracePeriod=600 Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.650418 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19" exitCode=0 Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.650513 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19"} Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.651021 4952 scope.go:117] "RemoveContainer" containerID="5f35d23af81d1d053b0cb10ef07f55474bcfadceb139bb522d996b063f18401b" Nov 22 03:09:58 crc kubenswrapper[4952]: I1122 03:09:58.999317 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.037940 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038024 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038082 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038145 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcz6r\" (UniqueName: \"kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038240 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038276 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn\") pod \"a68a727e-a1df-4627-8b4f-78e8293b58c6\" (UID: \"a68a727e-a1df-4627-8b4f-78e8293b58c6\") " Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038784 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038840 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run" (OuterVolumeSpecName: "var-run") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.038862 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.039259 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts" (OuterVolumeSpecName: "scripts") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.039629 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.053880 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r" (OuterVolumeSpecName: "kube-api-access-jcz6r") pod "a68a727e-a1df-4627-8b4f-78e8293b58c6" (UID: "a68a727e-a1df-4627-8b4f-78e8293b58c6"). InnerVolumeSpecName "kube-api-access-jcz6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145492 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145905 4952 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145919 4952 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145935 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcz6r\" (UniqueName: \"kubernetes.io/projected/a68a727e-a1df-4627-8b4f-78e8293b58c6-kube-api-access-jcz6r\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145952 4952 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a68a727e-a1df-4627-8b4f-78e8293b58c6-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.145966 4952 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a68a727e-a1df-4627-8b4f-78e8293b58c6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.663414 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5"} Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.669023 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9kcvz-config-9j4dk" 
event={"ID":"a68a727e-a1df-4627-8b4f-78e8293b58c6","Type":"ContainerDied","Data":"504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0"} Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.669086 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="504344e7ceb19b98f3d11f955c4b1710ce206c1a3b56fd09b8bd7ba45ecdcaf0" Nov 22 03:09:59 crc kubenswrapper[4952]: I1122 03:09:59.669093 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9kcvz-config-9j4dk" Nov 22 03:10:00 crc kubenswrapper[4952]: I1122 03:10:00.125113 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9kcvz-config-9j4dk"] Nov 22 03:10:00 crc kubenswrapper[4952]: I1122 03:10:00.130913 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9kcvz-config-9j4dk"] Nov 22 03:10:00 crc kubenswrapper[4952]: I1122 03:10:00.542301 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68a727e-a1df-4627-8b4f-78e8293b58c6" path="/var/lib/kubelet/pods/a68a727e-a1df-4627-8b4f-78e8293b58c6/volumes" Nov 22 03:10:00 crc kubenswrapper[4952]: I1122 03:10:00.671030 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9kcvz" Nov 22 03:10:07 crc kubenswrapper[4952]: I1122 03:10:07.052652 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Nov 22 03:10:07 crc kubenswrapper[4952]: I1122 03:10:07.409821 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:10:10 crc kubenswrapper[4952]: I1122 03:10:10.790434 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hcq9n" event={"ID":"7dceab59-b76c-4c04-b00f-d81a39fd90ab","Type":"ContainerStarted","Data":"6312b4ccf8e25d15976035ae6511f6b0c688ed6da4a54454d97bbcd4f5bf3829"} Nov 22 03:10:10 crc kubenswrapper[4952]: I1122 03:10:10.827878 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hcq9n" podStartSLOduration=1.8260548920000002 podStartE2EDuration="14.827848113s" podCreationTimestamp="2025-11-22 03:09:56 +0000 UTC" firstStartedPulling="2025-11-22 03:09:57.112334619 +0000 UTC m=+961.418351892" lastFinishedPulling="2025-11-22 03:10:10.1141278 +0000 UTC m=+974.420145113" observedRunningTime="2025-11-22 03:10:10.820109455 +0000 UTC m=+975.126126768" watchObservedRunningTime="2025-11-22 03:10:10.827848113 +0000 UTC m=+975.133865376" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.052868 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.493356 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-r69xs"] Nov 22 03:10:17 crc kubenswrapper[4952]: E1122 03:10:17.493862 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68a727e-a1df-4627-8b4f-78e8293b58c6" containerName="ovn-config" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.493888 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68a727e-a1df-4627-8b4f-78e8293b58c6" containerName="ovn-config" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.494108 4952 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a68a727e-a1df-4627-8b4f-78e8293b58c6" containerName="ovn-config" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.496275 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.515611 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r69xs"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.524071 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2d41-account-create-xj9f4"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.525499 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.527467 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.529809 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsr6j\" (UniqueName: \"kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.529854 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.529926 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.529972 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9j2w\" (UniqueName: \"kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.544795 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2d41-account-create-xj9f4"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.606586 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-20da-account-create-hffxc"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.607973 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.614102 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631309 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fdrzh"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631761 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsr6j\" (UniqueName: \"kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631812 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631841 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt55v\" (UniqueName: \"kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631919 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631953 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9j2w\" (UniqueName: \"kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.631981 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.632746 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.634525 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.635425 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.639956 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fdrzh"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.651640 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-20da-account-create-hffxc"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.674571 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsr6j\" (UniqueName: \"kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j\") pod \"barbican-2d41-account-create-xj9f4\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.684709 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9j2w\" (UniqueName: \"kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w\") pod \"cinder-db-create-r69xs\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.734515 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.734658 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts\") pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.734699 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl8v2\" (UniqueName: \"kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2\") pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.734732 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt55v\" (UniqueName: \"kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " 
pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.735943 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.757236 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt55v\" (UniqueName: \"kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v\") pod \"cinder-20da-account-create-hffxc\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.798793 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fjb79"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.800217 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.811307 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fjb79"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.815133 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.851395 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.895411 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.897970 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8wb\" (UniqueName: \"kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.898151 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts\") pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.898228 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl8v2\" (UniqueName: \"kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2\") pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.903882 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts\") 
pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.924429 4952 generic.go:334] "Generic (PLEG): container finished" podID="7dceab59-b76c-4c04-b00f-d81a39fd90ab" containerID="6312b4ccf8e25d15976035ae6511f6b0c688ed6da4a54454d97bbcd4f5bf3829" exitCode=0 Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.924492 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hcq9n" event={"ID":"7dceab59-b76c-4c04-b00f-d81a39fd90ab","Type":"ContainerDied","Data":"6312b4ccf8e25d15976035ae6511f6b0c688ed6da4a54454d97bbcd4f5bf3829"} Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.932089 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl8v2\" (UniqueName: \"kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2\") pod \"barbican-db-create-fdrzh\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.932656 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.947296 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-pq7nm"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.949179 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.953834 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.954272 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.954534 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nwhtr" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.954690 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.957042 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.963583 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-pq7nm"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.988391 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c721-account-create-kckf7"] Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.989923 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.994052 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 22 03:10:17 crc kubenswrapper[4952]: I1122 03:10:17.994348 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c721-account-create-kckf7"] Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.001019 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.001092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8wb\" (UniqueName: \"kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.001131 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.001164 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslpj\" (UniqueName: \"kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.011177 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.029194 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8wb\" (UniqueName: \"kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb\") pod \"neutron-db-create-fjb79\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.103802 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslpj\" (UniqueName: \"kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.103911 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmsnc\" (UniqueName: \"kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc\") pod \"keystone-db-sync-pq7nm\" (UID: 
\"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.103938 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.103989 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.104033 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.104926 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.119781 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.130427 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslpj\" (UniqueName: \"kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj\") pod \"neutron-c721-account-create-kckf7\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.205872 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmsnc\" (UniqueName: \"kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.205935 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.205997 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.211829 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.212632 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.227166 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmsnc\" (UniqueName: \"kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc\") pod \"keystone-db-sync-pq7nm\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.282487 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.337004 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.447931 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r69xs"] Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.546097 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2d41-account-create-xj9f4"] Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.561781 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf2f79bb_54b5_45f1_a95e_9923eefb464d.slice/crio-8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557 WatchSource:0}: Error finding container 8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557: Status 404 returned error can't find the container with id 8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557 Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.580706 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fjb79"] Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.611419 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6ba510_eab8_457f_9103_2f49b46115da.slice/crio-8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26 WatchSource:0}: Error finding container 8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26: Status 404 returned error can't find the container with id 8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26 Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.617054 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fdrzh"] Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.630055 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea8e22e_d027_4c7e_805d_20ba08d5a87d.slice/crio-fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e WatchSource:0}: Error finding container fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e: Status 404 returned error can't find the container with id fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.643288 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-20da-account-create-hffxc"] Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.673432 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded3b067_1f44_401e_afd4_d10955e87f52.slice/crio-310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4 WatchSource:0}: Error finding container 310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4: Status 404 returned error can't find the container with id 310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4 Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.786997 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c721-account-create-kckf7"] Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.803895 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02579246_918f_41d2_b71c_661abcdb0072.slice/crio-f90123b11696b230e5d1d364e0fd2c9552bec1c0a6902c3e295b951810acbada WatchSource:0}: 
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.857182 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-pq7nm"]
Nov 22 03:10:18 crc kubenswrapper[4952]: W1122 03:10:18.871790 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5725796_1375_41a7_a8d6_80035aabc3d1.slice/crio-85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9 WatchSource:0}: Error finding container 85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9: Status 404 returned error can't find the container with id 85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.943360 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r69xs" event={"ID":"afad0208-0c5d-49dd-ac0b-54871b694e7d","Type":"ContainerStarted","Data":"13596658045d96daf3a1e90d777e675aa30a919581aecab26a5befffa0fe2382"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.943413 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r69xs" event={"ID":"afad0208-0c5d-49dd-ac0b-54871b694e7d","Type":"ContainerStarted","Data":"000c9a1068da3b463d15256ff5fb0222a231ac9db1a995510c09d1384620d620"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.947065 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjb79" event={"ID":"5c6ba510-eab8-457f-9103-2f49b46115da","Type":"ContainerStarted","Data":"8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.952447 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c721-account-create-kckf7" event={"ID":"02579246-918f-41d2-b71c-661abcdb0072","Type":"ContainerStarted","Data":"f90123b11696b230e5d1d364e0fd2c9552bec1c0a6902c3e295b951810acbada"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.964001 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pq7nm" event={"ID":"c5725796-1375-41a7-a8d6-80035aabc3d1","Type":"ContainerStarted","Data":"85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.968192 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-r69xs" podStartSLOduration=1.968155595 podStartE2EDuration="1.968155595s" podCreationTimestamp="2025-11-22 03:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:18.965325458 +0000 UTC m=+983.271342731" watchObservedRunningTime="2025-11-22 03:10:18.968155595 +0000 UTC m=+983.274172868"
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.970382 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20da-account-create-hffxc" event={"ID":"ded3b067-1f44-401e-afd4-d10955e87f52","Type":"ContainerStarted","Data":"310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.975644 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2d41-account-create-xj9f4" event={"ID":"bf2f79bb-54b5-45f1-a95e-9923eefb464d","Type":"ContainerStarted","Data":"8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557"}
Nov 22 03:10:18 crc kubenswrapper[4952]: I1122 03:10:18.980056 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrzh" event={"ID":"aea8e22e-d027-4c7e-805d-20ba08d5a87d","Type":"ContainerStarted","Data":"fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e"}
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.598647 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hcq9n"
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.738147 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r72qt\" (UniqueName: \"kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt\") pod \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") "
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.738225 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data\") pod \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") "
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.738348 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle\") pod \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") "
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.738989 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data\") pod \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\" (UID: \"7dceab59-b76c-4c04-b00f-d81a39fd90ab\") "
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.760181 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt" (OuterVolumeSpecName: "kube-api-access-r72qt") pod "7dceab59-b76c-4c04-b00f-d81a39fd90ab" (UID: "7dceab59-b76c-4c04-b00f-d81a39fd90ab"). InnerVolumeSpecName "kube-api-access-r72qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.768854 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7dceab59-b76c-4c04-b00f-d81a39fd90ab" (UID: "7dceab59-b76c-4c04-b00f-d81a39fd90ab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.787557 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dceab59-b76c-4c04-b00f-d81a39fd90ab" (UID: "7dceab59-b76c-4c04-b00f-d81a39fd90ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.820924 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data" (OuterVolumeSpecName: "config-data") pod "7dceab59-b76c-4c04-b00f-d81a39fd90ab" (UID: "7dceab59-b76c-4c04-b00f-d81a39fd90ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.841851 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.841903 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r72qt\" (UniqueName: \"kubernetes.io/projected/7dceab59-b76c-4c04-b00f-d81a39fd90ab-kube-api-access-r72qt\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.841951 4952 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:19 crc kubenswrapper[4952]: I1122 03:10:19.841964 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dceab59-b76c-4c04-b00f-d81a39fd90ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.017937 4952 generic.go:334] "Generic (PLEG): container finished" podID="5c6ba510-eab8-457f-9103-2f49b46115da" containerID="08ef6326853e97e711274c54bed1096fdec5a682022b7ce93b6e08421de32bbf" exitCode=0 Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.018016 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjb79" event={"ID":"5c6ba510-eab8-457f-9103-2f49b46115da","Type":"ContainerDied","Data":"08ef6326853e97e711274c54bed1096fdec5a682022b7ce93b6e08421de32bbf"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.021760 4952 generic.go:334] "Generic (PLEG): container finished" podID="02579246-918f-41d2-b71c-661abcdb0072" containerID="d21104b47037277978c37d63f95e2bc633183b4c2c5ccb18fa643fdedfc48731" exitCode=0 Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.021825 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c721-account-create-kckf7" event={"ID":"02579246-918f-41d2-b71c-661abcdb0072","Type":"ContainerDied","Data":"d21104b47037277978c37d63f95e2bc633183b4c2c5ccb18fa643fdedfc48731"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.024053 4952 generic.go:334] "Generic (PLEG): container finished" podID="ded3b067-1f44-401e-afd4-d10955e87f52" containerID="471bc27e97ddcd2067ad8223a8511faf4e2cb12a2f4c76914925268474cce5c9" exitCode=0 Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.024087 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20da-account-create-hffxc" event={"ID":"ded3b067-1f44-401e-afd4-d10955e87f52","Type":"ContainerDied","Data":"471bc27e97ddcd2067ad8223a8511faf4e2cb12a2f4c76914925268474cce5c9"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.026293 4952 generic.go:334] "Generic (PLEG): container finished" podID="bf2f79bb-54b5-45f1-a95e-9923eefb464d" containerID="03a4dfbceb005964f4300417e04cd67e4788e0f9197bd4f30f38f5db2b23bba9" exitCode=0 Nov 22 03:10:20 crc 
kubenswrapper[4952]: I1122 03:10:20.026423 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2d41-account-create-xj9f4" event={"ID":"bf2f79bb-54b5-45f1-a95e-9923eefb464d","Type":"ContainerDied","Data":"03a4dfbceb005964f4300417e04cd67e4788e0f9197bd4f30f38f5db2b23bba9"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.028919 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hcq9n" event={"ID":"7dceab59-b76c-4c04-b00f-d81a39fd90ab","Type":"ContainerDied","Data":"5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.028968 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6814edd59a9551a605efd4c0a00048380545ede29a0ce8eb02482b7bf0755d" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.029081 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hcq9n" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.049822 4952 generic.go:334] "Generic (PLEG): container finished" podID="aea8e22e-d027-4c7e-805d-20ba08d5a87d" containerID="7b78a37dc8ff70c046db65123698eb6cbc658df757704874ffe3e57665f7f5cf" exitCode=0 Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.049959 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrzh" event={"ID":"aea8e22e-d027-4c7e-805d-20ba08d5a87d","Type":"ContainerDied","Data":"7b78a37dc8ff70c046db65123698eb6cbc658df757704874ffe3e57665f7f5cf"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.055811 4952 generic.go:334] "Generic (PLEG): container finished" podID="afad0208-0c5d-49dd-ac0b-54871b694e7d" containerID="13596658045d96daf3a1e90d777e675aa30a919581aecab26a5befffa0fe2382" exitCode=0 Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.055876 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r69xs" event={"ID":"afad0208-0c5d-49dd-ac0b-54871b694e7d","Type":"ContainerDied","Data":"13596658045d96daf3a1e90d777e675aa30a919581aecab26a5befffa0fe2382"} Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.320526 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:20 crc kubenswrapper[4952]: E1122 03:10:20.320987 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dceab59-b76c-4c04-b00f-d81a39fd90ab" containerName="glance-db-sync" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.321001 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dceab59-b76c-4c04-b00f-d81a39fd90ab" containerName="glance-db-sync" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.321202 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dceab59-b76c-4c04-b00f-d81a39fd90ab" containerName="glance-db-sync" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.322387 4952 util.go:30] "No sandbox for pod can be found. 
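The cpu_manager / state_mem / memory_manager "RemoveStaleState" lines above fire when a new pod (here dnsmasq-dns-54f9b7b8d9-zxtzz) is admitted: the resource managers take that opportunity to prune assignment state left behind by pods that no longer exist, in this case the completed glance-db-sync pod. A sketch of that pruning pass, with a hypothetical assignments map standing in for the managers' real checkpointed state:

    package main

    import "fmt"

    // removeStaleState drops per-container assignments for pods the kubelet
    // no longer tracks, analogous to the RemoveStaleState lines above.
    // assignments maps podUID -> containerName -> assigned resource set.
    func removeStaleState(assignments map[string]map[string]string, active map[string]bool) {
        for podUID, containers := range assignments {
            if active[podUID] {
                continue // pod still exists; keep its assignments
            }
            for name := range containers {
                fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", name, podUID)
            }
            delete(assignments, podUID) // deleting during range is safe in Go
        }
    }

    func main() {
        assignments := map[string]map[string]string{
            "7dceab59-b76c-4c04-b00f-d81a39fd90ab": {"glance-db-sync": "cpuset 0-1"},
        }
        removeStaleState(assignments, map[string]bool{}) // the glance pod is gone
    }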
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.344085 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.388965 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.389064 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.389128 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.389147 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.389367 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkql\" (UniqueName: \"kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.492001 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.492110 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.492188 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.492258 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.492295 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkql\" (UniqueName: \"kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.493467 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.493500 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.493866 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.494502 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.515844 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkql\" (UniqueName: \"kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql\") pod \"dnsmasq-dns-54f9b7b8d9-zxtzz\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:20 crc kubenswrapper[4952]: I1122 03:10:20.668218 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.206807 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.418974 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.518713 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsr6j\" (UniqueName: \"kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j\") pod \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.519347 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts\") pod \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\" (UID: \"bf2f79bb-54b5-45f1-a95e-9923eefb464d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.521108 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf2f79bb-54b5-45f1-a95e-9923eefb464d" (UID: "bf2f79bb-54b5-45f1-a95e-9923eefb464d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.525597 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j" (OuterVolumeSpecName: "kube-api-access-gsr6j") pod "bf2f79bb-54b5-45f1-a95e-9923eefb464d" (UID: "bf2f79bb-54b5-45f1-a95e-9923eefb464d"). InnerVolumeSpecName "kube-api-access-gsr6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.616311 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.623883 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsr6j\" (UniqueName: \"kubernetes.io/projected/bf2f79bb-54b5-45f1-a95e-9923eefb464d-kube-api-access-gsr6j\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.623917 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf2f79bb-54b5-45f1-a95e-9923eefb464d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.630506 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.649776 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.651564 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.660331 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.725819 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wslpj\" (UniqueName: \"kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj\") pod \"02579246-918f-41d2-b71c-661abcdb0072\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.725876 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp8wb\" (UniqueName: \"kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb\") pod \"5c6ba510-eab8-457f-9103-2f49b46115da\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.725978 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt55v\" (UniqueName: \"kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v\") pod \"ded3b067-1f44-401e-afd4-d10955e87f52\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726086 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts\") pod \"afad0208-0c5d-49dd-ac0b-54871b694e7d\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726105 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts\") pod \"5c6ba510-eab8-457f-9103-2f49b46115da\" (UID: \"5c6ba510-eab8-457f-9103-2f49b46115da\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726172 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts\") pod \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726215 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl8v2\" (UniqueName: \"kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2\") pod \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\" (UID: \"aea8e22e-d027-4c7e-805d-20ba08d5a87d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726249 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts\") pod \"ded3b067-1f44-401e-afd4-d10955e87f52\" (UID: \"ded3b067-1f44-401e-afd4-d10955e87f52\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726282 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9j2w\" (UniqueName: \"kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w\") pod \"afad0208-0c5d-49dd-ac0b-54871b694e7d\" (UID: \"afad0208-0c5d-49dd-ac0b-54871b694e7d\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.726352 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts\") pod \"02579246-918f-41d2-b71c-661abcdb0072\" (UID: \"02579246-918f-41d2-b71c-661abcdb0072\") " Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.728901 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afad0208-0c5d-49dd-ac0b-54871b694e7d" (UID: "afad0208-0c5d-49dd-ac0b-54871b694e7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.729100 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ded3b067-1f44-401e-afd4-d10955e87f52" (UID: "ded3b067-1f44-401e-afd4-d10955e87f52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.729362 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c6ba510-eab8-457f-9103-2f49b46115da" (UID: "5c6ba510-eab8-457f-9103-2f49b46115da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.729405 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aea8e22e-d027-4c7e-805d-20ba08d5a87d" (UID: "aea8e22e-d027-4c7e-805d-20ba08d5a87d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.729472 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02579246-918f-41d2-b71c-661abcdb0072" (UID: "02579246-918f-41d2-b71c-661abcdb0072"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.732663 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj" (OuterVolumeSpecName: "kube-api-access-wslpj") pod "02579246-918f-41d2-b71c-661abcdb0072" (UID: "02579246-918f-41d2-b71c-661abcdb0072"). InnerVolumeSpecName "kube-api-access-wslpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.733014 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb" (OuterVolumeSpecName: "kube-api-access-mp8wb") pod "5c6ba510-eab8-457f-9103-2f49b46115da" (UID: "5c6ba510-eab8-457f-9103-2f49b46115da"). InnerVolumeSpecName "kube-api-access-mp8wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.734745 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v" (OuterVolumeSpecName: "kube-api-access-mt55v") pod "ded3b067-1f44-401e-afd4-d10955e87f52" (UID: "ded3b067-1f44-401e-afd4-d10955e87f52"). InnerVolumeSpecName "kube-api-access-mt55v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.740925 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w" (OuterVolumeSpecName: "kube-api-access-h9j2w") pod "afad0208-0c5d-49dd-ac0b-54871b694e7d" (UID: "afad0208-0c5d-49dd-ac0b-54871b694e7d"). InnerVolumeSpecName "kube-api-access-h9j2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.741789 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2" (OuterVolumeSpecName: "kube-api-access-cl8v2") pod "aea8e22e-d027-4c7e-805d-20ba08d5a87d" (UID: "aea8e22e-d027-4c7e-805d-20ba08d5a87d"). InnerVolumeSpecName "kube-api-access-cl8v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828762 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9j2w\" (UniqueName: \"kubernetes.io/projected/afad0208-0c5d-49dd-ac0b-54871b694e7d-kube-api-access-h9j2w\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828803 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02579246-918f-41d2-b71c-661abcdb0072-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828816 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp8wb\" (UniqueName: \"kubernetes.io/projected/5c6ba510-eab8-457f-9103-2f49b46115da-kube-api-access-mp8wb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828826 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wslpj\" (UniqueName: \"kubernetes.io/projected/02579246-918f-41d2-b71c-661abcdb0072-kube-api-access-wslpj\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828834 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt55v\" (UniqueName: \"kubernetes.io/projected/ded3b067-1f44-401e-afd4-d10955e87f52-kube-api-access-mt55v\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828843 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afad0208-0c5d-49dd-ac0b-54871b694e7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828851 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c6ba510-eab8-457f-9103-2f49b46115da-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828859 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aea8e22e-d027-4c7e-805d-20ba08d5a87d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828868 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl8v2\" (UniqueName: \"kubernetes.io/projected/aea8e22e-d027-4c7e-805d-20ba08d5a87d-kube-api-access-cl8v2\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:21 crc kubenswrapper[4952]: I1122 03:10:21.828878 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ded3b067-1f44-401e-afd4-d10955e87f52-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.090087 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r69xs" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.091837 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r69xs" event={"ID":"afad0208-0c5d-49dd-ac0b-54871b694e7d","Type":"ContainerDied","Data":"000c9a1068da3b463d15256ff5fb0222a231ac9db1a995510c09d1384620d620"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.091930 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000c9a1068da3b463d15256ff5fb0222a231ac9db1a995510c09d1384620d620" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.097134 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjb79" event={"ID":"5c6ba510-eab8-457f-9103-2f49b46115da","Type":"ContainerDied","Data":"8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.097192 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8565ad0488d3be0c81bf24d8c6af2bf80134705edd7f8f77ce1a09c1b4360e26" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.097147 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjb79" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.103948 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c721-account-create-kckf7" event={"ID":"02579246-918f-41d2-b71c-661abcdb0072","Type":"ContainerDied","Data":"f90123b11696b230e5d1d364e0fd2c9552bec1c0a6902c3e295b951810acbada"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.103995 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90123b11696b230e5d1d364e0fd2c9552bec1c0a6902c3e295b951810acbada" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.104057 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c721-account-create-kckf7" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.107998 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerID="8f4e6c21ffc821abc3ba28be91ea1ccb803c7a23bf66c9cc94c3feb6272f72e8" exitCode=0 Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.108066 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" event={"ID":"8fb3d24b-a353-43fa-bef7-a4441d643cad","Type":"ContainerDied","Data":"8f4e6c21ffc821abc3ba28be91ea1ccb803c7a23bf66c9cc94c3feb6272f72e8"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.108089 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" event={"ID":"8fb3d24b-a353-43fa-bef7-a4441d643cad","Type":"ContainerStarted","Data":"160e9231174c7a523bfd10d782562c06b1bcf20ad53d77c17f8fc30ad20d19d7"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.115700 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20da-account-create-hffxc" event={"ID":"ded3b067-1f44-401e-afd4-d10955e87f52","Type":"ContainerDied","Data":"310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.115745 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="310022470e0e42406fa71a8ac62aa527f0378970fd81ff6bad55aef686b478c4" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.115832 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20da-account-create-hffxc" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.119874 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2d41-account-create-xj9f4" event={"ID":"bf2f79bb-54b5-45f1-a95e-9923eefb464d","Type":"ContainerDied","Data":"8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.119936 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8288984c2f3d2134bcafef81f360b6197f28dd9e07d4dd73b6f73cc4f2d61557" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.120009 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2d41-account-create-xj9f4" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.138961 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fdrzh" Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.138882 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fdrzh" event={"ID":"aea8e22e-d027-4c7e-805d-20ba08d5a87d","Type":"ContainerDied","Data":"fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e"} Nov 22 03:10:22 crc kubenswrapper[4952]: I1122 03:10:22.139739 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb94a8bcac84623bedb1ed6cd0988eaa1d383eebc65bb72a62043c84b6ecce9e" Nov 22 03:10:22 crc kubenswrapper[4952]: E1122 03:10:22.285911 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6ba510_eab8_457f_9103_2f49b46115da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafad0208_0c5d_49dd_ac0b_54871b694e7d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea8e22e_d027_4c7e_805d_20ba08d5a87d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafad0208_0c5d_49dd_ac0b_54871b694e7d.slice/crio-000c9a1068da3b463d15256ff5fb0222a231ac9db1a995510c09d1384620d620\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf2f79bb_54b5_45f1_a95e_9923eefb464d.slice\": RecentStats: unable to find data in memory cache]" Nov 22 03:10:26 crc kubenswrapper[4952]: I1122 03:10:26.183626 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" event={"ID":"8fb3d24b-a353-43fa-bef7-a4441d643cad","Type":"ContainerStarted","Data":"6cbe39e0a4721cdbd1831b381aeb5d0757e82baa6484579b3414079256470238"} Nov 22 03:10:26 crc kubenswrapper[4952]: I1122 03:10:26.184249 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:26 crc kubenswrapper[4952]: I1122 03:10:26.186238 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pq7nm" event={"ID":"c5725796-1375-41a7-a8d6-80035aabc3d1","Type":"ContainerStarted","Data":"9b86a7d5b15cea6565ffd2161d277be665d71dcf56084346cf00d31accf620ad"} Nov 22 03:10:26 crc kubenswrapper[4952]: I1122 03:10:26.222369 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" podStartSLOduration=6.222348327 podStartE2EDuration="6.222348327s" podCreationTimestamp="2025-11-22 03:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:26.21131603 +0000 UTC m=+990.517333323" watchObservedRunningTime="2025-11-22 03:10:26.222348327 +0000 UTC m=+990.528365600" Nov 22 03:10:26 crc kubenswrapper[4952]: I1122 03:10:26.254812 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-pq7nm" podStartSLOduration=2.976450235 podStartE2EDuration="9.254765241s" podCreationTimestamp="2025-11-22 03:10:17 +0000 UTC" firstStartedPulling="2025-11-22 03:10:18.87597338 +0000 UTC m=+983.181990653" lastFinishedPulling="2025-11-22 03:10:25.154288386 +0000 UTC m=+989.460305659" observedRunningTime="2025-11-22 03:10:26.239772786 +0000 UTC 
m=+990.545790059" watchObservedRunningTime="2025-11-22 03:10:26.254765241 +0000 UTC m=+990.560782594" Nov 22 03:10:30 crc kubenswrapper[4952]: I1122 03:10:30.234423 4952 generic.go:334] "Generic (PLEG): container finished" podID="c5725796-1375-41a7-a8d6-80035aabc3d1" containerID="9b86a7d5b15cea6565ffd2161d277be665d71dcf56084346cf00d31accf620ad" exitCode=0 Nov 22 03:10:30 crc kubenswrapper[4952]: I1122 03:10:30.234451 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pq7nm" event={"ID":"c5725796-1375-41a7-a8d6-80035aabc3d1","Type":"ContainerDied","Data":"9b86a7d5b15cea6565ffd2161d277be665d71dcf56084346cf00d31accf620ad"} Nov 22 03:10:30 crc kubenswrapper[4952]: I1122 03:10:30.670790 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:30 crc kubenswrapper[4952]: I1122 03:10:30.756905 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:10:30 crc kubenswrapper[4952]: I1122 03:10:30.757305 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="dnsmasq-dns" containerID="cri-o://1a24ed5e9e0ca403572dd57ce7f7eeaa09afe8f2fcfd280cb1f7fe26dfbc4c41" gracePeriod=10 Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.250808 4952 generic.go:334] "Generic (PLEG): container finished" podID="d536449d-9892-4188-81e1-62b5c87627b1" containerID="1a24ed5e9e0ca403572dd57ce7f7eeaa09afe8f2fcfd280cb1f7fe26dfbc4c41" exitCode=0 Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.250892 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" event={"ID":"d536449d-9892-4188-81e1-62b5c87627b1","Type":"ContainerDied","Data":"1a24ed5e9e0ca403572dd57ce7f7eeaa09afe8f2fcfd280cb1f7fe26dfbc4c41"} Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.522423 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.618154 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.632761 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle\") pod \"c5725796-1375-41a7-a8d6-80035aabc3d1\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.632816 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vhx\" (UniqueName: \"kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx\") pod \"d536449d-9892-4188-81e1-62b5c87627b1\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.632856 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb\") pod \"d536449d-9892-4188-81e1-62b5c87627b1\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.632910 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb\") pod \"d536449d-9892-4188-81e1-62b5c87627b1\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.632956 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmsnc\" (UniqueName: \"kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc\") pod \"c5725796-1375-41a7-a8d6-80035aabc3d1\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.633000 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc\") pod \"d536449d-9892-4188-81e1-62b5c87627b1\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.633038 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config\") pod \"d536449d-9892-4188-81e1-62b5c87627b1\" (UID: \"d536449d-9892-4188-81e1-62b5c87627b1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.633071 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data\") pod \"c5725796-1375-41a7-a8d6-80035aabc3d1\" (UID: \"c5725796-1375-41a7-a8d6-80035aabc3d1\") " Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.640366 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc" (OuterVolumeSpecName: "kube-api-access-gmsnc") pod "c5725796-1375-41a7-a8d6-80035aabc3d1" (UID: "c5725796-1375-41a7-a8d6-80035aabc3d1"). InnerVolumeSpecName "kube-api-access-gmsnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.640620 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx" (OuterVolumeSpecName: "kube-api-access-g6vhx") pod "d536449d-9892-4188-81e1-62b5c87627b1" (UID: "d536449d-9892-4188-81e1-62b5c87627b1"). InnerVolumeSpecName "kube-api-access-g6vhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.671193 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5725796-1375-41a7-a8d6-80035aabc3d1" (UID: "c5725796-1375-41a7-a8d6-80035aabc3d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.682020 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config" (OuterVolumeSpecName: "config") pod "d536449d-9892-4188-81e1-62b5c87627b1" (UID: "d536449d-9892-4188-81e1-62b5c87627b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.683077 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d536449d-9892-4188-81e1-62b5c87627b1" (UID: "d536449d-9892-4188-81e1-62b5c87627b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.684209 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d536449d-9892-4188-81e1-62b5c87627b1" (UID: "d536449d-9892-4188-81e1-62b5c87627b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.685572 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d536449d-9892-4188-81e1-62b5c87627b1" (UID: "d536449d-9892-4188-81e1-62b5c87627b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.707734 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data" (OuterVolumeSpecName: "config-data") pod "c5725796-1375-41a7-a8d6-80035aabc3d1" (UID: "c5725796-1375-41a7-a8d6-80035aabc3d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736653 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmsnc\" (UniqueName: \"kubernetes.io/projected/c5725796-1375-41a7-a8d6-80035aabc3d1-kube-api-access-gmsnc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736728 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736739 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736753 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736769 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5725796-1375-41a7-a8d6-80035aabc3d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736782 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6vhx\" (UniqueName: \"kubernetes.io/projected/d536449d-9892-4188-81e1-62b5c87627b1-kube-api-access-g6vhx\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736795 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4952]: I1122 03:10:31.736807 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d536449d-9892-4188-81e1-62b5c87627b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.262447 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.262493 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zrl6p" event={"ID":"d536449d-9892-4188-81e1-62b5c87627b1","Type":"ContainerDied","Data":"aa4c263fd382c54581956d24e33cbb1450240c937c1f54ac0a4821b538afed39"} Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.262711 4952 scope.go:117] "RemoveContainer" containerID="1a24ed5e9e0ca403572dd57ce7f7eeaa09afe8f2fcfd280cb1f7fe26dfbc4c41" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.265698 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pq7nm" event={"ID":"c5725796-1375-41a7-a8d6-80035aabc3d1","Type":"ContainerDied","Data":"85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9"} Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.265731 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pq7nm" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.265752 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f7ba0ccdadbae4591d88ac6c644cd2d84f9cdc3abfe07d5ce24926a3b39ac9" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.318450 4952 scope.go:117] "RemoveContainer" containerID="eb86b867aa978309b9fb36a59345c210c87ff6e3adeb197ff12337e93d59e298" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.350622 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.363731 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zrl6p"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.552813 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d536449d-9892-4188-81e1-62b5c87627b1" path="/var/lib/kubelet/pods/d536449d-9892-4188-81e1-62b5c87627b1/volumes" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.553715 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554183 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02579246-918f-41d2-b71c-661abcdb0072" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554211 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="02579246-918f-41d2-b71c-661abcdb0072" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554249 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6ba510-eab8-457f-9103-2f49b46115da" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554260 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6ba510-eab8-457f-9103-2f49b46115da" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554274 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="dnsmasq-dns" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554282 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="dnsmasq-dns" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554294 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded3b067-1f44-401e-afd4-d10955e87f52" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554302 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded3b067-1f44-401e-afd4-d10955e87f52" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554320 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2f79bb-54b5-45f1-a95e-9923eefb464d" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554330 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2f79bb-54b5-45f1-a95e-9923eefb464d" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554346 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea8e22e-d027-4c7e-805d-20ba08d5a87d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554355 4952 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="aea8e22e-d027-4c7e-805d-20ba08d5a87d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554370 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5725796-1375-41a7-a8d6-80035aabc3d1" containerName="keystone-db-sync" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554380 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5725796-1375-41a7-a8d6-80035aabc3d1" containerName="keystone-db-sync" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554396 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="init" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554405 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="init" Nov 22 03:10:32 crc kubenswrapper[4952]: E1122 03:10:32.554425 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afad0208-0c5d-49dd-ac0b-54871b694e7d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.554434 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="afad0208-0c5d-49dd-ac0b-54871b694e7d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558024 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="afad0208-0c5d-49dd-ac0b-54871b694e7d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558463 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="02579246-918f-41d2-b71c-661abcdb0072" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558534 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d536449d-9892-4188-81e1-62b5c87627b1" containerName="dnsmasq-dns" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558562 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2f79bb-54b5-45f1-a95e-9923eefb464d" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558592 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6ba510-eab8-457f-9103-2f49b46115da" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558607 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded3b067-1f44-401e-afd4-d10955e87f52" containerName="mariadb-account-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558628 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea8e22e-d027-4c7e-805d-20ba08d5a87d" containerName="mariadb-database-create" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.558652 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5725796-1375-41a7-a8d6-80035aabc3d1" containerName="keystone-db-sync" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.560308 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.576124 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pwx9z"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.577517 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.584285 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.584522 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.584681 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nwhtr" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.584727 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.584755 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.597324 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.614583 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pwx9z"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657258 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqdc\" (UniqueName: \"kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657433 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657465 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657501 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657527 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkwf\" (UniqueName: \"kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657570 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data\") pod \"keystone-bootstrap-pwx9z\" (UID: 
\"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657628 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657662 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657689 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.657733 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.760931 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761014 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqdc\" (UniqueName: \"kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761108 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761163 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: 
\"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761194 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761334 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkwf\" (UniqueName: \"kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761354 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761405 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761434 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761495 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.761575 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.763562 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.765468 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.766712 
4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.766929 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.805943 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.808410 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.808449 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.808825 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.826870 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkwf\" (UniqueName: \"kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.839574 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqdc\" (UniqueName: \"kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc\") pod \"dnsmasq-dns-6546db6db7-lw6pd\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.852385 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data\") pod \"keystone-bootstrap-pwx9z\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.897914 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-msxp9"] Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.899114 4952 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.900922 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.936120 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.937427 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qv46f" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.939445 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.951342 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:32 crc kubenswrapper[4952]: I1122 03:10:32.963736 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.006216 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.006393 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.016467 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.016730 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.035230 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-msxp9"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.059878 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9bjtf"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.061494 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.065834 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.066186 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rlg7g" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.066400 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070323 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070383 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070443 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070483 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfsp\" (UniqueName: \"kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.070516 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.079372 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bjtf"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.096462 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s7fqh"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.097956 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.109075 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.113428 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.113921 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c6sc6" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.114254 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.116214 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s7fqh"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.140143 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8zw6r"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.141706 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.144749 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.144970 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pwbhn" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.160417 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8zw6r"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175061 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175112 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175191 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt46v\" (UniqueName: \"kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175214 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175245 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd\") pod \"ceilometer-0\" (UID: 
\"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175266 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175291 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175317 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175334 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175353 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175448 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175483 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175501 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175524 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2ck\" (UniqueName: \"kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175562 4952 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4vj\" (UniqueName: \"kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175594 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175612 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175644 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175673 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175708 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfsp\" (UniqueName: \"kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.175727 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.187599 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.197172 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.200703 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.205245 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.216376 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.218231 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.218783 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.222615 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfsp\" (UniqueName: \"kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.223082 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts\") pod \"cinder-db-sync-msxp9\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277492 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277584 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277621 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z67v\" (UniqueName: \"kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277639 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" 
(UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277661 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt46v\" (UniqueName: \"kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277691 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277721 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbp5q\" (UniqueName: \"kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277742 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277759 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277780 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277799 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277828 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277843 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277881 4952 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277903 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2ck\" (UniqueName: \"kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277925 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4vj\" (UniqueName: \"kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277948 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277973 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.277993 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.278014 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.278036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.278069 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.278093 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data\") pod 
\"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.287909 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.290934 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.291179 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.294794 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.296270 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.297711 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.297994 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.299251 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.299930 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.304398 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " 
pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.305232 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.309954 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.315174 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt46v\" (UniqueName: \"kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v\") pod \"ceilometer-0\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.321138 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4vj\" (UniqueName: \"kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj\") pod \"placement-db-sync-s7fqh\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.325701 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd2ck\" (UniqueName: \"kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck\") pod \"neutron-db-sync-9bjtf\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.379842 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.379921 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.379973 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380004 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z67v\" (UniqueName: \"kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380023 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380049 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbp5q\" (UniqueName: \"kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380078 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380133 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.380904 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.381295 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.381844 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.382368 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.388201 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.388373 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.402463 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbp5q\" (UniqueName: \"kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q\") pod \"dnsmasq-dns-7987f74bbc-g7scg\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.402928 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z67v\" (UniqueName: \"kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v\") pod \"barbican-db-sync-8zw6r\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.444486 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-msxp9" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.495605 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.526991 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.545047 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s7fqh" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.577848 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.599593 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.615779 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pwx9z"] Nov 22 03:10:33 crc kubenswrapper[4952]: W1122 03:10:33.654161 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20506ad4_9a8b_4a29_ae3b_35730fe5f2d0.slice/crio-234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d WatchSource:0}: Error finding container 234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d: Status 404 returned error can't find the container with id 234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d Nov 22 03:10:33 crc kubenswrapper[4952]: I1122 03:10:33.679033 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:33 crc kubenswrapper[4952]: W1122 03:10:33.768434 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68636d65_d61f_4a83_95e2_1ef07aa6c362.slice/crio-9d81a9dc4d94356e34b1990e219725cf3563fadfc5f74878f50f460315a26d63 WatchSource:0}: Error finding container 9d81a9dc4d94356e34b1990e219725cf3563fadfc5f74878f50f460315a26d63: Status 404 returned error can't find the container with id 9d81a9dc4d94356e34b1990e219725cf3563fadfc5f74878f50f460315a26d63 Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.012383 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-msxp9"] Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.243131 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:34 crc kubenswrapper[4952]: W1122 03:10:34.250995 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod728bb230_a89d_4ef1_ae63_1fe2ad7e7589.slice/crio-70ee7d5a51e27df4d2d75b5e29f9e0ec29393fb20974454ae3e21bf0ee82bc63 WatchSource:0}: Error finding container 70ee7d5a51e27df4d2d75b5e29f9e0ec29393fb20974454ae3e21bf0ee82bc63: Status 404 returned error can't find the container with id 70ee7d5a51e27df4d2d75b5e29f9e0ec29393fb20974454ae3e21bf0ee82bc63 Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.271810 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bjtf"] Nov 22 03:10:34 crc kubenswrapper[4952]: W1122 03:10:34.286625 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89e1e7cf_32af_45ef_b1bd_36fb741a1ffb.slice/crio-24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a WatchSource:0}: Error finding container 24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a: Status 404 returned error can't find the container with id 24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.337089 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-msxp9" event={"ID":"dd05df7b-eac0-4a1e-b957-506c8a4c56c4","Type":"ContainerStarted","Data":"3eb45cae9e6d8630e3fdefc9ab99e45325fa0afae3598c9380fa313f6a207d80"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.339127 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bjtf" 
event={"ID":"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb","Type":"ContainerStarted","Data":"24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.351037 4952 generic.go:334] "Generic (PLEG): container finished" podID="68636d65-d61f-4a83-95e2-1ef07aa6c362" containerID="e07bc94a3c5850ce14050e639533997fd5bf3caedb140d081047e57eb863277d" exitCode=0 Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.351242 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" event={"ID":"68636d65-d61f-4a83-95e2-1ef07aa6c362","Type":"ContainerDied","Data":"e07bc94a3c5850ce14050e639533997fd5bf3caedb140d081047e57eb863277d"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.351277 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" event={"ID":"68636d65-d61f-4a83-95e2-1ef07aa6c362","Type":"ContainerStarted","Data":"9d81a9dc4d94356e34b1990e219725cf3563fadfc5f74878f50f460315a26d63"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.357188 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerStarted","Data":"70ee7d5a51e27df4d2d75b5e29f9e0ec29393fb20974454ae3e21bf0ee82bc63"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.371261 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwx9z" event={"ID":"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0","Type":"ContainerStarted","Data":"c5e0d1a64bde42e0b6c42e227a939d9e166c659c5146d1fcf9c1c30594d9aaeb"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.371386 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwx9z" event={"ID":"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0","Type":"ContainerStarted","Data":"234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d"} Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.398040 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s7fqh"] Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.441987 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8zw6r"] Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.449732 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.451334 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pwx9z" podStartSLOduration=2.451309268 podStartE2EDuration="2.451309268s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:34.424634579 +0000 UTC m=+998.730651862" watchObservedRunningTime="2025-11-22 03:10:34.451309268 +0000 UTC m=+998.757326541" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.671485 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.815823 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config\") pod \"68636d65-d61f-4a83-95e2-1ef07aa6c362\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.815967 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb\") pod \"68636d65-d61f-4a83-95e2-1ef07aa6c362\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.816024 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb\") pod \"68636d65-d61f-4a83-95e2-1ef07aa6c362\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.816055 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdqdc\" (UniqueName: \"kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc\") pod \"68636d65-d61f-4a83-95e2-1ef07aa6c362\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.816113 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc\") pod \"68636d65-d61f-4a83-95e2-1ef07aa6c362\" (UID: \"68636d65-d61f-4a83-95e2-1ef07aa6c362\") " Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.853215 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc" (OuterVolumeSpecName: "kube-api-access-wdqdc") pod "68636d65-d61f-4a83-95e2-1ef07aa6c362" (UID: "68636d65-d61f-4a83-95e2-1ef07aa6c362"). InnerVolumeSpecName "kube-api-access-wdqdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.873858 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68636d65-d61f-4a83-95e2-1ef07aa6c362" (UID: "68636d65-d61f-4a83-95e2-1ef07aa6c362"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.875882 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config" (OuterVolumeSpecName: "config") pod "68636d65-d61f-4a83-95e2-1ef07aa6c362" (UID: "68636d65-d61f-4a83-95e2-1ef07aa6c362"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.878148 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68636d65-d61f-4a83-95e2-1ef07aa6c362" (UID: "68636d65-d61f-4a83-95e2-1ef07aa6c362"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.884845 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68636d65-d61f-4a83-95e2-1ef07aa6c362" (UID: "68636d65-d61f-4a83-95e2-1ef07aa6c362"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.918254 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.918301 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.918314 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.918324 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdqdc\" (UniqueName: \"kubernetes.io/projected/68636d65-d61f-4a83-95e2-1ef07aa6c362-kube-api-access-wdqdc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:34 crc kubenswrapper[4952]: I1122 03:10:34.918333 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68636d65-d61f-4a83-95e2-1ef07aa6c362-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.390270 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.413008 4952 generic.go:334] "Generic (PLEG): container finished" podID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerID="c1a683d4b7255ba13427566faf87dd6110cd3e3dc66f1270768b6d80c0018979" exitCode=0 Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.413090 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" event={"ID":"985ac08f-2330-454b-94ba-a78dd7f376e0","Type":"ContainerDied","Data":"c1a683d4b7255ba13427566faf87dd6110cd3e3dc66f1270768b6d80c0018979"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.413162 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" event={"ID":"985ac08f-2330-454b-94ba-a78dd7f376e0","Type":"ContainerStarted","Data":"99eef59d9d3629f47e6d23bd01e8a9d17fa8be30625cfa3b2ce4034d88a2c4c1"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.420001 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s7fqh" event={"ID":"fe0628cb-3a91-4838-9eeb-dc8b9087969e","Type":"ContainerStarted","Data":"7cb73afdde52b7e2ea3701512e56b356eb9cb5d4622195fa75311cb420c50584"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.426015 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bjtf" event={"ID":"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb","Type":"ContainerStarted","Data":"0dedce8b1945cf3f0fb295895c6b546e1875621fd07d650c41cae1e91903a4e5"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.436238 4952 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" event={"ID":"68636d65-d61f-4a83-95e2-1ef07aa6c362","Type":"ContainerDied","Data":"9d81a9dc4d94356e34b1990e219725cf3563fadfc5f74878f50f460315a26d63"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.436310 4952 scope.go:117] "RemoveContainer" containerID="e07bc94a3c5850ce14050e639533997fd5bf3caedb140d081047e57eb863277d" Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.436303 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-lw6pd" Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.440426 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zw6r" event={"ID":"90966d16-8b8d-461f-9bb9-827f0d8cd48b","Type":"ContainerStarted","Data":"443e925ee12353d211c95beef8d440b68cf50f13b51ed607f621c128df6047aa"} Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.499047 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9bjtf" podStartSLOduration=3.49901915 podStartE2EDuration="3.49901915s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:35.481641923 +0000 UTC m=+999.787659196" watchObservedRunningTime="2025-11-22 03:10:35.49901915 +0000 UTC m=+999.805036423" Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.561209 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:35 crc kubenswrapper[4952]: I1122 03:10:35.581002 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-lw6pd"] Nov 22 03:10:36 crc kubenswrapper[4952]: I1122 03:10:36.460348 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" event={"ID":"985ac08f-2330-454b-94ba-a78dd7f376e0","Type":"ContainerStarted","Data":"8c6641c19e1e848e50665cb44372ed562e652102ba996dbe1e4a4da722ab8057"} Nov 22 03:10:36 crc kubenswrapper[4952]: I1122 03:10:36.460449 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:36 crc kubenswrapper[4952]: I1122 03:10:36.486020 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" podStartSLOduration=4.485997757 podStartE2EDuration="4.485997757s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:36.484379054 +0000 UTC m=+1000.790396327" watchObservedRunningTime="2025-11-22 03:10:36.485997757 +0000 UTC m=+1000.792015050" Nov 22 03:10:36 crc kubenswrapper[4952]: I1122 03:10:36.547670 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68636d65-d61f-4a83-95e2-1ef07aa6c362" path="/var/lib/kubelet/pods/68636d65-d61f-4a83-95e2-1ef07aa6c362/volumes" Nov 22 03:10:40 crc kubenswrapper[4952]: I1122 03:10:40.525976 4952 generic.go:334] "Generic (PLEG): container finished" podID="20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" containerID="c5e0d1a64bde42e0b6c42e227a939d9e166c659c5146d1fcf9c1c30594d9aaeb" exitCode=0 Nov 22 03:10:40 crc kubenswrapper[4952]: I1122 03:10:40.526083 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwx9z" 
event={"ID":"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0","Type":"ContainerDied","Data":"c5e0d1a64bde42e0b6c42e227a939d9e166c659c5146d1fcf9c1c30594d9aaeb"} Nov 22 03:10:43 crc kubenswrapper[4952]: I1122 03:10:43.601709 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:10:43 crc kubenswrapper[4952]: I1122 03:10:43.667735 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:43 crc kubenswrapper[4952]: I1122 03:10:43.668468 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" containerID="cri-o://6cbe39e0a4721cdbd1831b381aeb5d0757e82baa6484579b3414079256470238" gracePeriod=10 Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.748919 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.913434 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.914044 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.914179 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.914242 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrkwf\" (UniqueName: \"kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.914362 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.914387 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data\") pod \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\" (UID: \"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0\") " Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.922564 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf" (OuterVolumeSpecName: "kube-api-access-mrkwf") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "kube-api-access-mrkwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.922735 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts" (OuterVolumeSpecName: "scripts") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.924212 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.937168 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.946933 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data" (OuterVolumeSpecName: "config-data") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:44 crc kubenswrapper[4952]: I1122 03:10:44.948210 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" (UID: "20506ad4-9a8b-4a29-ae3b-35730fe5f2d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017005 4952 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017046 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrkwf\" (UniqueName: \"kubernetes.io/projected/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-kube-api-access-mrkwf\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017061 4952 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017070 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017081 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.017091 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.594803 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerID="6cbe39e0a4721cdbd1831b381aeb5d0757e82baa6484579b3414079256470238" exitCode=0 Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.595102 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" event={"ID":"8fb3d24b-a353-43fa-bef7-a4441d643cad","Type":"ContainerDied","Data":"6cbe39e0a4721cdbd1831b381aeb5d0757e82baa6484579b3414079256470238"} Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.600146 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwx9z" event={"ID":"20506ad4-9a8b-4a29-ae3b-35730fe5f2d0","Type":"ContainerDied","Data":"234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d"} Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.600366 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234095c3bd176097cdd3079e07706d2a774b186333d87099ce61960e6617bb5d" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.600202 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pwx9z" Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.946834 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pwx9z"] Nov 22 03:10:45 crc kubenswrapper[4952]: I1122 03:10:45.954738 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pwx9z"] Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.059510 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-clwjv"] Nov 22 03:10:46 crc kubenswrapper[4952]: E1122 03:10:46.059997 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" containerName="keystone-bootstrap" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.060014 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" containerName="keystone-bootstrap" Nov 22 03:10:46 crc kubenswrapper[4952]: E1122 03:10:46.060037 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68636d65-d61f-4a83-95e2-1ef07aa6c362" containerName="init" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.060044 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="68636d65-d61f-4a83-95e2-1ef07aa6c362" containerName="init" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.060243 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="68636d65-d61f-4a83-95e2-1ef07aa6c362" containerName="init" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.060265 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" containerName="keystone-bootstrap" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.060927 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.065617 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.065981 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.066157 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.066322 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nwhtr" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.069149 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.075425 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-clwjv"] Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242178 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242248 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242313 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242356 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242663 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.242687 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfk5s\" (UniqueName: \"kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.344818 4952 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.344894 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.345010 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.345036 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfk5s\" (UniqueName: \"kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.345087 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.345113 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.350399 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.350485 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.350701 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.350905 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " 
pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.352715 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.367706 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfk5s\" (UniqueName: \"kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s\") pod \"keystone-bootstrap-clwjv\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.392089 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:10:46 crc kubenswrapper[4952]: I1122 03:10:46.545170 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20506ad4-9a8b-4a29-ae3b-35730fe5f2d0" path="/var/lib/kubelet/pods/20506ad4-9a8b-4a29-ae3b-35730fe5f2d0/volumes" Nov 22 03:10:50 crc kubenswrapper[4952]: I1122 03:10:50.669156 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Nov 22 03:10:55 crc kubenswrapper[4952]: E1122 03:10:55.376814 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 22 03:10:55 crc kubenswrapper[4952]: E1122 03:10:55.378013 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5z67v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-8zw6r_openstack(90966d16-8b8d-461f-9bb9-827f0d8cd48b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:10:55 crc kubenswrapper[4952]: E1122 03:10:55.379153 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-8zw6r" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.448038 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.557472 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config\") pod \"8fb3d24b-a353-43fa-bef7-a4441d643cad\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.558137 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb\") pod \"8fb3d24b-a353-43fa-bef7-a4441d643cad\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.558309 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc\") pod \"8fb3d24b-a353-43fa-bef7-a4441d643cad\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.558365 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkql\" (UniqueName: \"kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql\") pod \"8fb3d24b-a353-43fa-bef7-a4441d643cad\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.558455 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb\") pod \"8fb3d24b-a353-43fa-bef7-a4441d643cad\" (UID: \"8fb3d24b-a353-43fa-bef7-a4441d643cad\") " Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.573105 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql" (OuterVolumeSpecName: "kube-api-access-xkkql") pod "8fb3d24b-a353-43fa-bef7-a4441d643cad" (UID: "8fb3d24b-a353-43fa-bef7-a4441d643cad"). InnerVolumeSpecName "kube-api-access-xkkql". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.604940 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8fb3d24b-a353-43fa-bef7-a4441d643cad" (UID: "8fb3d24b-a353-43fa-bef7-a4441d643cad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.606147 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8fb3d24b-a353-43fa-bef7-a4441d643cad" (UID: "8fb3d24b-a353-43fa-bef7-a4441d643cad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.618084 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config" (OuterVolumeSpecName: "config") pod "8fb3d24b-a353-43fa-bef7-a4441d643cad" (UID: "8fb3d24b-a353-43fa-bef7-a4441d643cad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.625199 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8fb3d24b-a353-43fa-bef7-a4441d643cad" (UID: "8fb3d24b-a353-43fa-bef7-a4441d643cad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.661449 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.661495 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.661510 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.661520 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb3d24b-a353-43fa-bef7-a4441d643cad-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.661531 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkql\" (UniqueName: \"kubernetes.io/projected/8fb3d24b-a353-43fa-bef7-a4441d643cad-kube-api-access-xkkql\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.669525 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.719412 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" event={"ID":"8fb3d24b-a353-43fa-bef7-a4441d643cad","Type":"ContainerDied","Data":"160e9231174c7a523bfd10d782562c06b1bcf20ad53d77c17f8fc30ad20d19d7"} Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.719491 4952 scope.go:117] "RemoveContainer" containerID="6cbe39e0a4721cdbd1831b381aeb5d0757e82baa6484579b3414079256470238" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.719430 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-zxtzz" Nov 22 03:10:55 crc kubenswrapper[4952]: E1122 03:10:55.722036 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-8zw6r" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.767304 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:55 crc kubenswrapper[4952]: I1122 03:10:55.773131 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-zxtzz"] Nov 22 03:10:56 crc kubenswrapper[4952]: I1122 03:10:56.555760 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" path="/var/lib/kubelet/pods/8fb3d24b-a353-43fa-bef7-a4441d643cad/volumes" Nov 22 03:10:56 crc kubenswrapper[4952]: I1122 03:10:56.848430 4952 scope.go:117] "RemoveContainer" containerID="8f4e6c21ffc821abc3ba28be91ea1ccb803c7a23bf66c9cc94c3feb6272f72e8" Nov 22 03:10:56 crc kubenswrapper[4952]: E1122 03:10:56.872518 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 22 03:10:56 crc kubenswrapper[4952]: E1122 03:10:56.872757 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qcfsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-msxp9_openstack(dd05df7b-eac0-4a1e-b957-506c8a4c56c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:10:56 crc kubenswrapper[4952]: E1122 03:10:56.873923 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-msxp9" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.343754 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-clwjv"] Nov 22 03:10:57 crc kubenswrapper[4952]: W1122 03:10:57.350618 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod935b4905_14f3_4505_ba9c_225833a9bdb4.slice/crio-bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832 WatchSource:0}: Error finding container bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832: Status 404 returned error can't find the container with id bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832 Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.357028 4952 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"osp-secret" Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.741824 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerStarted","Data":"ae5ef5c57c3b9700f3cedb4f522c52a33dda611cb861f50f3859b14e47548034"} Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.744928 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-clwjv" event={"ID":"935b4905-14f3-4505-ba9c-225833a9bdb4","Type":"ContainerStarted","Data":"14cfdf8ac845bacbb9f2774b3d5b5f835bfcf4f296d65eb2dc4e4b43a11410f1"} Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.745002 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-clwjv" event={"ID":"935b4905-14f3-4505-ba9c-225833a9bdb4","Type":"ContainerStarted","Data":"bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832"} Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.746469 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s7fqh" event={"ID":"fe0628cb-3a91-4838-9eeb-dc8b9087969e","Type":"ContainerStarted","Data":"74143ebe0da99ad9e5b9d777f5bf17e6bc2205a641e56283d347e722d3a3aca0"} Nov 22 03:10:57 crc kubenswrapper[4952]: E1122 03:10:57.748806 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-msxp9" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.772116 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-clwjv" podStartSLOduration=11.772093185 podStartE2EDuration="11.772093185s" podCreationTimestamp="2025-11-22 03:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:57.769042303 +0000 UTC m=+1022.075059586" watchObservedRunningTime="2025-11-22 03:10:57.772093185 +0000 UTC m=+1022.078110458" Nov 22 03:10:57 crc kubenswrapper[4952]: I1122 03:10:57.818312 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s7fqh" podStartSLOduration=3.392362307 podStartE2EDuration="25.81828778s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="2025-11-22 03:10:34.428084702 +0000 UTC m=+998.734101975" lastFinishedPulling="2025-11-22 03:10:56.854010185 +0000 UTC m=+1021.160027448" observedRunningTime="2025-11-22 03:10:57.81605822 +0000 UTC m=+1022.122075493" watchObservedRunningTime="2025-11-22 03:10:57.81828778 +0000 UTC m=+1022.124305063" Nov 22 03:10:58 crc kubenswrapper[4952]: I1122 03:10:58.759315 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerStarted","Data":"5592e7714ee3fb15754c3ee78edd6f0103468868d1e7116d2e5ac7485cac64e0"} Nov 22 03:11:00 crc kubenswrapper[4952]: I1122 03:11:00.790101 4952 generic.go:334] "Generic (PLEG): container finished" podID="fe0628cb-3a91-4838-9eeb-dc8b9087969e" containerID="74143ebe0da99ad9e5b9d777f5bf17e6bc2205a641e56283d347e722d3a3aca0" exitCode=0 Nov 22 03:11:00 crc kubenswrapper[4952]: I1122 03:11:00.790189 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-s7fqh" event={"ID":"fe0628cb-3a91-4838-9eeb-dc8b9087969e","Type":"ContainerDied","Data":"74143ebe0da99ad9e5b9d777f5bf17e6bc2205a641e56283d347e722d3a3aca0"} Nov 22 03:11:01 crc kubenswrapper[4952]: I1122 03:11:01.802524 4952 generic.go:334] "Generic (PLEG): container finished" podID="935b4905-14f3-4505-ba9c-225833a9bdb4" containerID="14cfdf8ac845bacbb9f2774b3d5b5f835bfcf4f296d65eb2dc4e4b43a11410f1" exitCode=0 Nov 22 03:11:01 crc kubenswrapper[4952]: I1122 03:11:01.802668 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-clwjv" event={"ID":"935b4905-14f3-4505-ba9c-225833a9bdb4","Type":"ContainerDied","Data":"14cfdf8ac845bacbb9f2774b3d5b5f835bfcf4f296d65eb2dc4e4b43a11410f1"} Nov 22 03:11:01 crc kubenswrapper[4952]: I1122 03:11:01.805265 4952 generic.go:334] "Generic (PLEG): container finished" podID="89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" containerID="0dedce8b1945cf3f0fb295895c6b546e1875621fd07d650c41cae1e91903a4e5" exitCode=0 Nov 22 03:11:01 crc kubenswrapper[4952]: I1122 03:11:01.805483 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bjtf" event={"ID":"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb","Type":"ContainerDied","Data":"0dedce8b1945cf3f0fb295895c6b546e1875621fd07d650c41cae1e91903a4e5"} Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.449493 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.461489 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s7fqh" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.525870 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531107 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531199 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfk5s\" (UniqueName: \"kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531295 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531340 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531391 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531450 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts\") pod \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531582 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data\") pod \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531656 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht4vj\" (UniqueName: \"kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj\") pod \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531733 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle\") pod \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531815 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: 
\"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.531859 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs\") pod \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\" (UID: \"fe0628cb-3a91-4838-9eeb-dc8b9087969e\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.555139 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs" (OuterVolumeSpecName: "logs") pod "fe0628cb-3a91-4838-9eeb-dc8b9087969e" (UID: "fe0628cb-3a91-4838-9eeb-dc8b9087969e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.570933 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts" (OuterVolumeSpecName: "scripts") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.571068 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts" (OuterVolumeSpecName: "scripts") pod "fe0628cb-3a91-4838-9eeb-dc8b9087969e" (UID: "fe0628cb-3a91-4838-9eeb-dc8b9087969e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.577189 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.581742 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.581949 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj" (OuterVolumeSpecName: "kube-api-access-ht4vj") pod "fe0628cb-3a91-4838-9eeb-dc8b9087969e" (UID: "fe0628cb-3a91-4838-9eeb-dc8b9087969e"). InnerVolumeSpecName "kube-api-access-ht4vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.636606 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s" (OuterVolumeSpecName: "kube-api-access-jfk5s") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "kube-api-access-jfk5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.651425 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd2ck\" (UniqueName: \"kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck\") pod \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.651487 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle\") pod \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.651630 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config\") pod \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\" (UID: \"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb\") " Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652151 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfk5s\" (UniqueName: \"kubernetes.io/projected/935b4905-14f3-4505-ba9c-225833a9bdb4-kube-api-access-jfk5s\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652170 4952 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652180 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652190 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht4vj\" (UniqueName: \"kubernetes.io/projected/fe0628cb-3a91-4838-9eeb-dc8b9087969e-kube-api-access-ht4vj\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652200 4952 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652212 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0628cb-3a91-4838-9eeb-dc8b9087969e-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.652220 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.657889 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle podName:935b4905-14f3-4505-ba9c-225833a9bdb4 nodeName:}" failed. No retries permitted until 2025-11-22 03:11:04.157862172 +0000 UTC m=+1028.463879445 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4") : error deleting /var/lib/kubelet/pods/935b4905-14f3-4505-ba9c-225833a9bdb4/volume-subpaths: remove /var/lib/kubelet/pods/935b4905-14f3-4505-ba9c-225833a9bdb4/volume-subpaths: no such file or directory Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.670809 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data" (OuterVolumeSpecName: "config-data") pod "fe0628cb-3a91-4838-9eeb-dc8b9087969e" (UID: "fe0628cb-3a91-4838-9eeb-dc8b9087969e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.683880 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data" (OuterVolumeSpecName: "config-data") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.686434 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck" (OuterVolumeSpecName: "kube-api-access-wd2ck") pod "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" (UID: "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb"). InnerVolumeSpecName "kube-api-access-wd2ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.696609 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe0628cb-3a91-4838-9eeb-dc8b9087969e" (UID: "fe0628cb-3a91-4838-9eeb-dc8b9087969e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.709140 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config" (OuterVolumeSpecName: "config") pod "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" (UID: "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.730806 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" (UID: "89e1e7cf-32af-45ef-b1bd-36fb741a1ffb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773064 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773103 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773113 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd2ck\" (UniqueName: \"kubernetes.io/projected/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-kube-api-access-wd2ck\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773125 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773135 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0628cb-3a91-4838-9eeb-dc8b9087969e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.773145 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.828740 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bjtf" event={"ID":"89e1e7cf-32af-45ef-b1bd-36fb741a1ffb","Type":"ContainerDied","Data":"24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a"} Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.828789 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24faa7ecc3b85e5ca3834f8dacf31ccd9a296d80bf5f6c3b87034cbfde6ae25a" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.828866 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bjtf" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.831424 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerStarted","Data":"32d4ee73986ea1df18e9c5f5b66039615d6616f8a01a41d3eca8c848f180d3a5"} Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.833623 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-clwjv" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.833643 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-clwjv" event={"ID":"935b4905-14f3-4505-ba9c-225833a9bdb4","Type":"ContainerDied","Data":"bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832"} Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.833694 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfc2f4f82263169d5349236bf17e7f474e4b6319f77dd83cbef72e584990832" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.835343 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s7fqh" event={"ID":"fe0628cb-3a91-4838-9eeb-dc8b9087969e","Type":"ContainerDied","Data":"7cb73afdde52b7e2ea3701512e56b356eb9cb5d4622195fa75311cb420c50584"} Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.835372 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb73afdde52b7e2ea3701512e56b356eb9cb5d4622195fa75311cb420c50584" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.835429 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s7fqh" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.925613 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65f67fb964-vlq9s"] Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.926018 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" containerName="neutron-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926036 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" containerName="neutron-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.926071 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935b4905-14f3-4505-ba9c-225833a9bdb4" containerName="keystone-bootstrap" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926080 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="935b4905-14f3-4505-ba9c-225833a9bdb4" containerName="keystone-bootstrap" Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.926090 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0628cb-3a91-4838-9eeb-dc8b9087969e" containerName="placement-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926099 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0628cb-3a91-4838-9eeb-dc8b9087969e" containerName="placement-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.926114 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="init" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926122 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="init" Nov 22 03:11:03 crc kubenswrapper[4952]: E1122 03:11:03.926136 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926142 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926349 4952 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe0628cb-3a91-4838-9eeb-dc8b9087969e" containerName="placement-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926391 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" containerName="neutron-db-sync" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926410 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb3d24b-a353-43fa-bef7-a4441d643cad" containerName="dnsmasq-dns" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.926422 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="935b4905-14f3-4505-ba9c-225833a9bdb4" containerName="keystone-bootstrap" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.927217 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.933010 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.934873 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.945881 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65f67fb964-vlq9s"] Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976560 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-fernet-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976642 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bd8\" (UniqueName: \"kubernetes.io/projected/28d23392-8bab-45b0-a64b-a440b2850703-kube-api-access-f2bd8\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976676 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-internal-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976702 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-credential-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976735 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-scripts\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976752 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-config-data\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976769 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-combined-ca-bundle\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:03 crc kubenswrapper[4952]: I1122 03:11:03.976799 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-public-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078442 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bd8\" (UniqueName: \"kubernetes.io/projected/28d23392-8bab-45b0-a64b-a440b2850703-kube-api-access-f2bd8\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078508 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-internal-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078544 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-credential-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078628 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-scripts\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078646 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-config-data\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078660 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-combined-ca-bundle\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078695 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-public-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.078734 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-fernet-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.086369 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-public-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.086491 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-fernet-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.088904 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-scripts\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.089707 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-internal-tls-certs\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.091731 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-combined-ca-bundle\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.093149 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-credential-keys\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.093381 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d23392-8bab-45b0-a64b-a440b2850703-config-data\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.159005 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bd8\" (UniqueName: \"kubernetes.io/projected/28d23392-8bab-45b0-a64b-a440b2850703-kube-api-access-f2bd8\") pod \"keystone-65f67fb964-vlq9s\" (UID: \"28d23392-8bab-45b0-a64b-a440b2850703\") " 
pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.171849 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.173614 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.181456 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") pod \"935b4905-14f3-4505-ba9c-225833a9bdb4\" (UID: \"935b4905-14f3-4505-ba9c-225833a9bdb4\") " Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.188847 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.198254 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "935b4905-14f3-4505-ba9c-225833a9bdb4" (UID: "935b4905-14f3-4505-ba9c-225833a9bdb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.242680 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.282905 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-856965f45b-8d4qq"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284111 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284165 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnbpt\" (UniqueName: \"kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284219 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284304 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284332 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.284411 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935b4905-14f3-4505-ba9c-225833a9bdb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.285521 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.293501 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.293763 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.293895 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.294075 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rlg7g" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.319192 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-856965f45b-8d4qq"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386090 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386168 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbpt\" (UniqueName: \"kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386203 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386244 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw4sn\" (UniqueName: \"kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386301 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386381 4952 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386437 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386466 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386491 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.386580 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.391014 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.391398 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.392286 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.398379 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.428665 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnbpt\" (UniqueName: 
\"kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt\") pod \"dnsmasq-dns-7b946d459c-fbkhc\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.488909 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.488979 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.489047 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.489092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.489118 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw4sn\" (UniqueName: \"kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.493927 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.494041 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.494958 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.498277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " 
pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.511798 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.526447 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw4sn\" (UniqueName: \"kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn\") pod \"neutron-856965f45b-8d4qq\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.669271 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64f5d57fc8-zqkpt"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.671140 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.674976 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.675049 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.675232 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c6sc6" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.675280 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.675404 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.682444 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700528 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-scripts\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700625 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-combined-ca-bundle\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700798 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d787583c-f83b-4ced-aee7-b3c9c22217a7-logs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700838 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8jjf\" (UniqueName: \"kubernetes.io/projected/d787583c-f83b-4ced-aee7-b3c9c22217a7-kube-api-access-s8jjf\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700881 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-internal-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700932 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-config-data\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.700956 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-public-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.713368 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64f5d57fc8-zqkpt"] Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802597 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d787583c-f83b-4ced-aee7-b3c9c22217a7-logs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802644 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s8jjf\" (UniqueName: \"kubernetes.io/projected/d787583c-f83b-4ced-aee7-b3c9c22217a7-kube-api-access-s8jjf\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802688 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-internal-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802783 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-config-data\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802806 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-public-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802849 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-scripts\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.802900 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-combined-ca-bundle\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.803071 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d787583c-f83b-4ced-aee7-b3c9c22217a7-logs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.814389 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-public-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.815037 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-internal-tls-certs\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.819171 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-scripts\") pod \"placement-64f5d57fc8-zqkpt\" (UID: 
\"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.820511 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-config-data\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.833948 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d787583c-f83b-4ced-aee7-b3c9c22217a7-combined-ca-bundle\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.834604 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8jjf\" (UniqueName: \"kubernetes.io/projected/d787583c-f83b-4ced-aee7-b3c9c22217a7-kube-api-access-s8jjf\") pod \"placement-64f5d57fc8-zqkpt\" (UID: \"d787583c-f83b-4ced-aee7-b3c9c22217a7\") " pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:04 crc kubenswrapper[4952]: I1122 03:11:04.835527 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65f67fb964-vlq9s"] Nov 22 03:11:04 crc kubenswrapper[4952]: W1122 03:11:04.852729 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d23392_8bab_45b0_a64b_a440b2850703.slice/crio-dd4661cb3ebaebad2b890250c1828b171a384797349b6b0911477842a209719a WatchSource:0}: Error finding container dd4661cb3ebaebad2b890250c1828b171a384797349b6b0911477842a209719a: Status 404 returned error can't find the container with id dd4661cb3ebaebad2b890250c1828b171a384797349b6b0911477842a209719a Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.014933 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.165640 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.546112 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64f5d57fc8-zqkpt"] Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.868498 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64f5d57fc8-zqkpt" event={"ID":"d787583c-f83b-4ced-aee7-b3c9c22217a7","Type":"ContainerStarted","Data":"1c0a66ed0453ef3e2cd7e419f9aa152bcdd16a19853f2430415ea66549950e75"} Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.871153 4952 generic.go:334] "Generic (PLEG): container finished" podID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerID="6e7b184d50d8ea796698026b12dd584c56aecfeec5d780741a6a448de04556eb" exitCode=0 Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.871209 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" event={"ID":"9a7f108a-a73c-40d7-8aa0-4050c7819915","Type":"ContainerDied","Data":"6e7b184d50d8ea796698026b12dd584c56aecfeec5d780741a6a448de04556eb"} Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.871228 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" event={"ID":"9a7f108a-a73c-40d7-8aa0-4050c7819915","Type":"ContainerStarted","Data":"176792cbe0ff28c007c471f38f36b2dede1bd03ac35d39cae2a7f665438b4b0e"} Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.877301 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65f67fb964-vlq9s" event={"ID":"28d23392-8bab-45b0-a64b-a440b2850703","Type":"ContainerStarted","Data":"82d998f8200c6d22cac31950ea461e1ba56e2db0927e21fca655c5ffcf62df86"} Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.877362 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65f67fb964-vlq9s" event={"ID":"28d23392-8bab-45b0-a64b-a440b2850703","Type":"ContainerStarted","Data":"dd4661cb3ebaebad2b890250c1828b171a384797349b6b0911477842a209719a"} Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.877800 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:05 crc kubenswrapper[4952]: I1122 03:11:05.911168 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-65f67fb964-vlq9s" podStartSLOduration=2.911145033 podStartE2EDuration="2.911145033s" podCreationTimestamp="2025-11-22 03:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:05.906529229 +0000 UTC m=+1030.212546502" watchObservedRunningTime="2025-11-22 03:11:05.911145033 +0000 UTC m=+1030.217162306" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.523999 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85576cd755-lg8j8"] Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.526330 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550119 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ktx9\" (UniqueName: \"kubernetes.io/projected/b5998a45-a2cf-4155-b31c-7c39c5768ec1-kube-api-access-5ktx9\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550185 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-internal-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550414 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-httpd-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550637 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-public-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550720 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-combined-ca-bundle\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550838 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.550888 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-ovndb-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.579332 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.580105 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.586694 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85576cd755-lg8j8"] Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653076 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-httpd-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653463 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-public-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653495 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-combined-ca-bundle\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653525 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653559 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-ovndb-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653678 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ktx9\" (UniqueName: \"kubernetes.io/projected/b5998a45-a2cf-4155-b31c-7c39c5768ec1-kube-api-access-5ktx9\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.653710 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-internal-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.660370 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.660814 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-combined-ca-bundle\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.663249 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-httpd-config\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " 
pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.666268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-internal-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.667832 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-ovndb-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.672709 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ktx9\" (UniqueName: \"kubernetes.io/projected/b5998a45-a2cf-4155-b31c-7c39c5768ec1-kube-api-access-5ktx9\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.679567 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5998a45-a2cf-4155-b31c-7c39c5768ec1-public-tls-certs\") pod \"neutron-85576cd755-lg8j8\" (UID: \"b5998a45-a2cf-4155-b31c-7c39c5768ec1\") " pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:06 crc kubenswrapper[4952]: I1122 03:11:06.921684 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.006139 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-856965f45b-8d4qq"] Nov 22 03:11:07 crc kubenswrapper[4952]: W1122 03:11:07.015947 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3701508_1734_46b5_83f3_9f08f930b294.slice/crio-065c40827d83ee6dc4c8bd823d97444fc3a165ea52aad31d330ca8672ed719d2 WatchSource:0}: Error finding container 065c40827d83ee6dc4c8bd823d97444fc3a165ea52aad31d330ca8672ed719d2: Status 404 returned error can't find the container with id 065c40827d83ee6dc4c8bd823d97444fc3a165ea52aad31d330ca8672ed719d2 Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.584945 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85576cd755-lg8j8"] Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.904352 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerStarted","Data":"de75c61bcfa98d6971fb7470dfcdbd2a151952e28b151207f88afbfd957018f8"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.904903 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerStarted","Data":"e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.904918 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerStarted","Data":"065c40827d83ee6dc4c8bd823d97444fc3a165ea52aad31d330ca8672ed719d2"} Nov 22 03:11:07 crc 
kubenswrapper[4952]: I1122 03:11:07.908122 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64f5d57fc8-zqkpt" event={"ID":"d787583c-f83b-4ced-aee7-b3c9c22217a7","Type":"ContainerStarted","Data":"bf23d116a87441b5fcdea6d56fb2e2718adb38bb7ee74162811b19c8242ded41"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.908167 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64f5d57fc8-zqkpt" event={"ID":"d787583c-f83b-4ced-aee7-b3c9c22217a7","Type":"ContainerStarted","Data":"98493962af994e213a42e59b265bb8f7a59c3cb063d6a5e7cc5d185ab269fa27"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.908400 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.908606 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.912852 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" event={"ID":"9a7f108a-a73c-40d7-8aa0-4050c7819915","Type":"ContainerStarted","Data":"409519a594a484c997f443e2b43a0045738b830d193110da9bd138c3ae5ca1aa"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.912943 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.915283 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85576cd755-lg8j8" event={"ID":"b5998a45-a2cf-4155-b31c-7c39c5768ec1","Type":"ContainerStarted","Data":"578088b5ee6ba80de0f4fbb37c19335c450b01fdf4573798ec29365d8991b607"} Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.936485 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-856965f45b-8d4qq" podStartSLOduration=3.93645144 podStartE2EDuration="3.93645144s" podCreationTimestamp="2025-11-22 03:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:07.928789453 +0000 UTC m=+1032.234806716" watchObservedRunningTime="2025-11-22 03:11:07.93645144 +0000 UTC m=+1032.242468713" Nov 22 03:11:07 crc kubenswrapper[4952]: I1122 03:11:07.969656 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64f5d57fc8-zqkpt" podStartSLOduration=3.969627053 podStartE2EDuration="3.969627053s" podCreationTimestamp="2025-11-22 03:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:07.959895541 +0000 UTC m=+1032.265912834" watchObservedRunningTime="2025-11-22 03:11:07.969627053 +0000 UTC m=+1032.275644356" Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.014436 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" podStartSLOduration=4.01440993 podStartE2EDuration="4.01440993s" podCreationTimestamp="2025-11-22 03:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:07.999652163 +0000 UTC m=+1032.305669436" watchObservedRunningTime="2025-11-22 03:11:08.01440993 +0000 UTC m=+1032.320427213" Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.933932 4952 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-85576cd755-lg8j8" event={"ID":"b5998a45-a2cf-4155-b31c-7c39c5768ec1","Type":"ContainerStarted","Data":"3657462114f164860e0eb567f621c0d9c7a77ec1fae1c2d82dc8d8b40520f6af"} Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.934995 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.935050 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85576cd755-lg8j8" event={"ID":"b5998a45-a2cf-4155-b31c-7c39c5768ec1","Type":"ContainerStarted","Data":"b107951212e46b9ae7b7160067690f3d48e3dca4346fa67ea07126af49e50140"} Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.935081 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:08 crc kubenswrapper[4952]: I1122 03:11:08.957680 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85576cd755-lg8j8" podStartSLOduration=2.957658819 podStartE2EDuration="2.957658819s" podCreationTimestamp="2025-11-22 03:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:08.954241097 +0000 UTC m=+1033.260258370" watchObservedRunningTime="2025-11-22 03:11:08.957658819 +0000 UTC m=+1033.263676092" Nov 22 03:11:14 crc kubenswrapper[4952]: I1122 03:11:14.514223 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:14 crc kubenswrapper[4952]: I1122 03:11:14.611742 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:11:14 crc kubenswrapper[4952]: I1122 03:11:14.613325 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="dnsmasq-dns" containerID="cri-o://8c6641c19e1e848e50665cb44372ed562e652102ba996dbe1e4a4da722ab8057" gracePeriod=10 Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.011126 4952 generic.go:334] "Generic (PLEG): container finished" podID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerID="8c6641c19e1e848e50665cb44372ed562e652102ba996dbe1e4a4da722ab8057" exitCode=0 Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.011621 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" event={"ID":"985ac08f-2330-454b-94ba-a78dd7f376e0","Type":"ContainerDied","Data":"8c6641c19e1e848e50665cb44372ed562e652102ba996dbe1e4a4da722ab8057"} Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.113125 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.253923 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbp5q\" (UniqueName: \"kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q\") pod \"985ac08f-2330-454b-94ba-a78dd7f376e0\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.253989 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc\") pod \"985ac08f-2330-454b-94ba-a78dd7f376e0\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.254015 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb\") pod \"985ac08f-2330-454b-94ba-a78dd7f376e0\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.254225 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config\") pod \"985ac08f-2330-454b-94ba-a78dd7f376e0\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.254256 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb\") pod \"985ac08f-2330-454b-94ba-a78dd7f376e0\" (UID: \"985ac08f-2330-454b-94ba-a78dd7f376e0\") " Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.262995 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q" (OuterVolumeSpecName: "kube-api-access-bbp5q") pod "985ac08f-2330-454b-94ba-a78dd7f376e0" (UID: "985ac08f-2330-454b-94ba-a78dd7f376e0"). InnerVolumeSpecName "kube-api-access-bbp5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.301524 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "985ac08f-2330-454b-94ba-a78dd7f376e0" (UID: "985ac08f-2330-454b-94ba-a78dd7f376e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.313343 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config" (OuterVolumeSpecName: "config") pod "985ac08f-2330-454b-94ba-a78dd7f376e0" (UID: "985ac08f-2330-454b-94ba-a78dd7f376e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.314510 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "985ac08f-2330-454b-94ba-a78dd7f376e0" (UID: "985ac08f-2330-454b-94ba-a78dd7f376e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.319146 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "985ac08f-2330-454b-94ba-a78dd7f376e0" (UID: "985ac08f-2330-454b-94ba-a78dd7f376e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.356347 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.356386 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.356400 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbp5q\" (UniqueName: \"kubernetes.io/projected/985ac08f-2330-454b-94ba-a78dd7f376e0-kube-api-access-bbp5q\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.356409 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4952]: I1122 03:11:15.356416 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/985ac08f-2330-454b-94ba-a78dd7f376e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.023354 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zw6r" event={"ID":"90966d16-8b8d-461f-9bb9-827f0d8cd48b","Type":"ContainerStarted","Data":"a3f97f63c0bbb317d9967cdbf48ee6ca2db94efda45a33eb259fa189b6088438"} Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027530 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerStarted","Data":"74442840faf39cce1ded85d7593adba19c33546f5c89620045fa931315ad44b0"} Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027651 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027642 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-central-agent" containerID="cri-o://ae5ef5c57c3b9700f3cedb4f522c52a33dda611cb861f50f3859b14e47548034" gracePeriod=30 Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027673 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="sg-core" containerID="cri-o://32d4ee73986ea1df18e9c5f5b66039615d6616f8a01a41d3eca8c848f180d3a5" gracePeriod=30 Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027719 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="proxy-httpd" 
containerID="cri-o://74442840faf39cce1ded85d7593adba19c33546f5c89620045fa931315ad44b0" gracePeriod=30 Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.027758 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-notification-agent" containerID="cri-o://5592e7714ee3fb15754c3ee78edd6f0103468868d1e7116d2e5ac7485cac64e0" gracePeriod=30 Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.031132 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" event={"ID":"985ac08f-2330-454b-94ba-a78dd7f376e0","Type":"ContainerDied","Data":"99eef59d9d3629f47e6d23bd01e8a9d17fa8be30625cfa3b2ce4034d88a2c4c1"} Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.031206 4952 scope.go:117] "RemoveContainer" containerID="8c6641c19e1e848e50665cb44372ed562e652102ba996dbe1e4a4da722ab8057" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.031420 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-g7scg" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.034757 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-msxp9" event={"ID":"dd05df7b-eac0-4a1e-b957-506c8a4c56c4","Type":"ContainerStarted","Data":"df095af4aaa28c2ce1bb66ab2fe7d4ee4861df62d886e6aa8ea343e5ff18893c"} Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.054237 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8zw6r" podStartSLOduration=3.756283003 podStartE2EDuration="44.054197843s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="2025-11-22 03:10:34.434018612 +0000 UTC m=+998.740035885" lastFinishedPulling="2025-11-22 03:11:14.731933452 +0000 UTC m=+1039.037950725" observedRunningTime="2025-11-22 03:11:16.047091842 +0000 UTC m=+1040.353109115" watchObservedRunningTime="2025-11-22 03:11:16.054197843 +0000 UTC m=+1040.360215166" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.064978 4952 scope.go:117] "RemoveContainer" containerID="c1a683d4b7255ba13427566faf87dd6110cd3e3dc66f1270768b6d80c0018979" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.081224 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.626927759 podStartE2EDuration="44.081197341s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="2025-11-22 03:10:34.279605191 +0000 UTC m=+998.585622464" lastFinishedPulling="2025-11-22 03:11:14.733874773 +0000 UTC m=+1039.039892046" observedRunningTime="2025-11-22 03:11:16.076113114 +0000 UTC m=+1040.382130387" watchObservedRunningTime="2025-11-22 03:11:16.081197341 +0000 UTC m=+1040.387214624" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.120343 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-msxp9" podStartSLOduration=3.409749406 podStartE2EDuration="44.120319056s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="2025-11-22 03:10:34.021839104 +0000 UTC m=+998.327856377" lastFinishedPulling="2025-11-22 03:11:14.732408764 +0000 UTC m=+1039.038426027" observedRunningTime="2025-11-22 03:11:16.102939577 +0000 UTC m=+1040.408956840" watchObservedRunningTime="2025-11-22 03:11:16.120319056 +0000 UTC m=+1040.426336329" Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.137863 4952 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.146693 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-g7scg"] Nov 22 03:11:16 crc kubenswrapper[4952]: I1122 03:11:16.545297 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" path="/var/lib/kubelet/pods/985ac08f-2330-454b-94ba-a78dd7f376e0/volumes" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059072 4952 generic.go:334] "Generic (PLEG): container finished" podID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerID="74442840faf39cce1ded85d7593adba19c33546f5c89620045fa931315ad44b0" exitCode=0 Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059158 4952 generic.go:334] "Generic (PLEG): container finished" podID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerID="32d4ee73986ea1df18e9c5f5b66039615d6616f8a01a41d3eca8c848f180d3a5" exitCode=2 Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059181 4952 generic.go:334] "Generic (PLEG): container finished" podID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerID="5592e7714ee3fb15754c3ee78edd6f0103468868d1e7116d2e5ac7485cac64e0" exitCode=0 Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059206 4952 generic.go:334] "Generic (PLEG): container finished" podID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerID="ae5ef5c57c3b9700f3cedb4f522c52a33dda611cb861f50f3859b14e47548034" exitCode=0 Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059322 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerDied","Data":"74442840faf39cce1ded85d7593adba19c33546f5c89620045fa931315ad44b0"} Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059377 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerDied","Data":"32d4ee73986ea1df18e9c5f5b66039615d6616f8a01a41d3eca8c848f180d3a5"} Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059405 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerDied","Data":"5592e7714ee3fb15754c3ee78edd6f0103468868d1e7116d2e5ac7485cac64e0"} Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.059427 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerDied","Data":"ae5ef5c57c3b9700f3cedb4f522c52a33dda611cb861f50f3859b14e47548034"} Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.408375 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506059 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt46v\" (UniqueName: \"kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506187 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506280 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506359 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506386 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506404 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506422 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd\") pod \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\" (UID: \"728bb230-a89d-4ef1-ae63-1fe2ad7e7589\") " Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.506916 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.507454 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.515284 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v" (OuterVolumeSpecName: "kube-api-access-tt46v") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "kube-api-access-tt46v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.518446 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts" (OuterVolumeSpecName: "scripts") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.543328 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.587699 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.608823 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.609144 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.609239 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.609319 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.609396 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt46v\" (UniqueName: \"kubernetes.io/projected/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-kube-api-access-tt46v\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.609483 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.612779 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data" (OuterVolumeSpecName: "config-data") pod "728bb230-a89d-4ef1-ae63-1fe2ad7e7589" (UID: "728bb230-a89d-4ef1-ae63-1fe2ad7e7589"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:17 crc kubenswrapper[4952]: I1122 03:11:17.711964 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728bb230-a89d-4ef1-ae63-1fe2ad7e7589-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.077278 4952 generic.go:334] "Generic (PLEG): container finished" podID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" containerID="a3f97f63c0bbb317d9967cdbf48ee6ca2db94efda45a33eb259fa189b6088438" exitCode=0 Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.077409 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zw6r" event={"ID":"90966d16-8b8d-461f-9bb9-827f0d8cd48b","Type":"ContainerDied","Data":"a3f97f63c0bbb317d9967cdbf48ee6ca2db94efda45a33eb259fa189b6088438"} Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.082829 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"728bb230-a89d-4ef1-ae63-1fe2ad7e7589","Type":"ContainerDied","Data":"70ee7d5a51e27df4d2d75b5e29f9e0ec29393fb20974454ae3e21bf0ee82bc63"} Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.082915 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.082920 4952 scope.go:117] "RemoveContainer" containerID="74442840faf39cce1ded85d7593adba19c33546f5c89620045fa931315ad44b0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.176958 4952 scope.go:117] "RemoveContainer" containerID="32d4ee73986ea1df18e9c5f5b66039615d6616f8a01a41d3eca8c848f180d3a5" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.186368 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.198036 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.206511 4952 scope.go:117] "RemoveContainer" containerID="5592e7714ee3fb15754c3ee78edd6f0103468868d1e7116d2e5ac7485cac64e0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221190 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221701 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="sg-core" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221721 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="sg-core" Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221763 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-notification-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221770 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-notification-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221784 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" 
containerName="proxy-httpd" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221791 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="proxy-httpd" Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221803 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="dnsmasq-dns" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221810 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="dnsmasq-dns" Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221826 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-central-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221833 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-central-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: E1122 03:11:18.221847 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="init" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.221853 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="init" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.222334 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="985ac08f-2330-454b-94ba-a78dd7f376e0" containerName="dnsmasq-dns" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.222371 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="sg-core" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.222390 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-central-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.222405 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="ceilometer-notification-agent" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.222422 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" containerName="proxy-httpd" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.224108 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.230244 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.230758 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.245398 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.260774 4952 scope.go:117] "RemoveContainer" containerID="ae5ef5c57c3b9700f3cedb4f522c52a33dda611cb861f50f3859b14e47548034" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.323211 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twctv\" (UniqueName: \"kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.323832 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.323898 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.324058 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.324107 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.324298 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.324347 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.426722 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twctv\" (UniqueName: 
\"kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.426805 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.428450 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.428619 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.428686 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.429412 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.429470 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.430032 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.430296 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.435696 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.442121 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data\") pod 
\"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.445328 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.459939 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.459992 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twctv\" (UniqueName: \"kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv\") pod \"ceilometer-0\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " pod="openstack/ceilometer-0" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.542313 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728bb230-a89d-4ef1-ae63-1fe2ad7e7589" path="/var/lib/kubelet/pods/728bb230-a89d-4ef1-ae63-1fe2ad7e7589/volumes" Nov 22 03:11:18 crc kubenswrapper[4952]: I1122 03:11:18.548824 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.684372 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.688378 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:11:19 crc kubenswrapper[4952]: W1122 03:11:19.689819 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb90cc371_f277_4e5c_9e95_5b8233d75503.slice/crio-3188a7d9e90a3aa3411929beeb33e7c08c31a29bb3ddf556d15cc57bb0d96696 WatchSource:0}: Error finding container 3188a7d9e90a3aa3411929beeb33e7c08c31a29bb3ddf556d15cc57bb0d96696: Status 404 returned error can't find the container with id 3188a7d9e90a3aa3411929beeb33e7c08c31a29bb3ddf556d15cc57bb0d96696 Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.755272 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data\") pod \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.755477 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle\") pod \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.756671 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z67v\" (UniqueName: \"kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v\") pod \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\" (UID: \"90966d16-8b8d-461f-9bb9-827f0d8cd48b\") " Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.763719 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "90966d16-8b8d-461f-9bb9-827f0d8cd48b" (UID: "90966d16-8b8d-461f-9bb9-827f0d8cd48b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.764513 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v" (OuterVolumeSpecName: "kube-api-access-5z67v") pod "90966d16-8b8d-461f-9bb9-827f0d8cd48b" (UID: "90966d16-8b8d-461f-9bb9-827f0d8cd48b"). InnerVolumeSpecName "kube-api-access-5z67v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.786636 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90966d16-8b8d-461f-9bb9-827f0d8cd48b" (UID: "90966d16-8b8d-461f-9bb9-827f0d8cd48b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.878349 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.878744 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z67v\" (UniqueName: \"kubernetes.io/projected/90966d16-8b8d-461f-9bb9-827f0d8cd48b-kube-api-access-5z67v\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:19 crc kubenswrapper[4952]: I1122 03:11:19.878891 4952 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90966d16-8b8d-461f-9bb9-827f0d8cd48b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.121019 4952 generic.go:334] "Generic (PLEG): container finished" podID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" containerID="df095af4aaa28c2ce1bb66ab2fe7d4ee4861df62d886e6aa8ea343e5ff18893c" exitCode=0 Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.121201 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-msxp9" event={"ID":"dd05df7b-eac0-4a1e-b957-506c8a4c56c4","Type":"ContainerDied","Data":"df095af4aaa28c2ce1bb66ab2fe7d4ee4861df62d886e6aa8ea343e5ff18893c"} Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.124131 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerStarted","Data":"3188a7d9e90a3aa3411929beeb33e7c08c31a29bb3ddf556d15cc57bb0d96696"} Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.126438 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zw6r" event={"ID":"90966d16-8b8d-461f-9bb9-827f0d8cd48b","Type":"ContainerDied","Data":"443e925ee12353d211c95beef8d440b68cf50f13b51ed607f621c128df6047aa"} Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.126464 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443e925ee12353d211c95beef8d440b68cf50f13b51ed607f621c128df6047aa" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.126508 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8zw6r" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.477974 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5c47d667f9-d4njv"] Nov 22 03:11:20 crc kubenswrapper[4952]: E1122 03:11:20.484405 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" containerName="barbican-db-sync" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.484442 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" containerName="barbican-db-sync" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.485127 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" containerName="barbican-db-sync" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.492649 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.503256 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.503768 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.505347 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pwbhn" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.523180 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c47d667f9-d4njv"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.573676 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f9d695d86-pd2dh"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.576073 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.579799 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.604102 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-combined-ca-bundle\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.604147 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data-custom\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.604205 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dd6\" (UniqueName: \"kubernetes.io/projected/cf053aa9-0d2f-486e-b410-89fa0afebaad-kube-api-access-p6dd6\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.604238 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf053aa9-0d2f-486e-b410-89fa0afebaad-logs\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.604294 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.621667 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-f9d695d86-pd2dh"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.638171 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.639951 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.660103 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711192 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf053aa9-0d2f-486e-b410-89fa0afebaad-logs\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711603 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data-custom\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711812 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2zb\" (UniqueName: \"kubernetes.io/projected/19657c5b-43d6-45c6-802f-7fe7e6665f11-kube-api-access-td2zb\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711864 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711892 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711922 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.711953 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-combined-ca-bundle\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712008 4952 
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712008 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712038 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712056 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-combined-ca-bundle\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712072 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqvr\" (UniqueName: \"kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.712096 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data-custom\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.716107 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.716179 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dd6\" (UniqueName: \"kubernetes.io/projected/cf053aa9-0d2f-486e-b410-89fa0afebaad-kube-api-access-p6dd6\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.716199 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19657c5b-43d6-45c6-802f-7fe7e6665f11-logs\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh"
pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.728525 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.733659 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.735467 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-combined-ca-bundle\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.736620 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.736966 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.742124 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.742401 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf053aa9-0d2f-486e-b410-89fa0afebaad-config-data-custom\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.769356 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dd6\" (UniqueName: \"kubernetes.io/projected/cf053aa9-0d2f-486e-b410-89fa0afebaad-kube-api-access-p6dd6\") pod \"barbican-worker-5c47d667f9-d4njv\" (UID: \"cf053aa9-0d2f-486e-b410-89fa0afebaad\") " pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818443 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22m9l\" (UniqueName: \"kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818490 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc 
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818517 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818572 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818591 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818613 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqvr\" (UniqueName: \"kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818641 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818674 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19657c5b-43d6-45c6-802f-7fe7e6665f11-logs\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818706 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc"
Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818727 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data-custom\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh"
\"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818774 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818813 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.818845 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-combined-ca-bundle\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.820782 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.821117 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.821353 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19657c5b-43d6-45c6-802f-7fe7e6665f11-logs\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.821660 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.821974 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.824399 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5c47d667f9-d4njv" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.825634 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data-custom\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.828756 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-combined-ca-bundle\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.832635 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19657c5b-43d6-45c6-802f-7fe7e6665f11-config-data\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.839904 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqvr\" (UniqueName: \"kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr\") pod \"dnsmasq-dns-6bb684768f-52p5g\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.841283 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2zb\" (UniqueName: \"kubernetes.io/projected/19657c5b-43d6-45c6-802f-7fe7e6665f11-kube-api-access-td2zb\") pod \"barbican-keystone-listener-f9d695d86-pd2dh\" (UID: \"19657c5b-43d6-45c6-802f-7fe7e6665f11\") " pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.920176 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.920293 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.920320 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22m9l\" (UniqueName: \"kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.920355 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs\") pod 
\"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.920390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.921197 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.924464 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.931081 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.931180 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.939231 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22m9l\" (UniqueName: \"kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l\") pod \"barbican-api-5897f54458-5r5nc\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.964173 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" Nov 22 03:11:20 crc kubenswrapper[4952]: I1122 03:11:20.978387 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.159196 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerStarted","Data":"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034"} Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.159270 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerStarted","Data":"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4"} Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.232183 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.310589 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c47d667f9-d4njv"] Nov 22 03:11:21 crc kubenswrapper[4952]: W1122 03:11:21.315722 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf053aa9_0d2f_486e_b410_89fa0afebaad.slice/crio-918ba3948e942b010e091a2a2ca06cde70574d442af395bc027d78460322afe4 WatchSource:0}: Error finding container 918ba3948e942b010e091a2a2ca06cde70574d442af395bc027d78460322afe4: Status 404 returned error can't find the container with id 918ba3948e942b010e091a2a2ca06cde70574d442af395bc027d78460322afe4 Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.528314 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f9d695d86-pd2dh"] Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.540909 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-msxp9" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.642622 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.643353 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcfsp\" (UniqueName: \"kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.643392 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.643612 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.643700 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.643809 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts\") pod \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\" (UID: \"dd05df7b-eac0-4a1e-b957-506c8a4c56c4\") " Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.647123 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.650760 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp" (OuterVolumeSpecName: "kube-api-access-qcfsp") pod "dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "kube-api-access-qcfsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.658626 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.659347 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts" (OuterVolumeSpecName: "scripts") pod "dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.682170 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.692004 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.723157 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data" (OuterVolumeSpecName: "config-data") pod "dd05df7b-eac0-4a1e-b957-506c8a4c56c4" (UID: "dd05df7b-eac0-4a1e-b957-506c8a4c56c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747123 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcfsp\" (UniqueName: \"kubernetes.io/projected/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-kube-api-access-qcfsp\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747168 4952 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747180 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747192 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747206 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.747217 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd05df7b-eac0-4a1e-b957-506c8a4c56c4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:21 crc kubenswrapper[4952]: I1122 03:11:21.848814 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.192608 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" event={"ID":"19657c5b-43d6-45c6-802f-7fe7e6665f11","Type":"ContainerStarted","Data":"896481c54e80d5331a0ba8dfb752b0919fb7ce8eebea08514e193da9cee43215"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.212882 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-msxp9" event={"ID":"dd05df7b-eac0-4a1e-b957-506c8a4c56c4","Type":"ContainerDied","Data":"3eb45cae9e6d8630e3fdefc9ab99e45325fa0afae3598c9380fa313f6a207d80"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.212945 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb45cae9e6d8630e3fdefc9ab99e45325fa0afae3598c9380fa313f6a207d80" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.213033 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-msxp9" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.241401 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c47d667f9-d4njv" event={"ID":"cf053aa9-0d2f-486e-b410-89fa0afebaad","Type":"ContainerStarted","Data":"918ba3948e942b010e091a2a2ca06cde70574d442af395bc027d78460322afe4"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.247111 4952 generic.go:334] "Generic (PLEG): container finished" podID="d558a143-8d99-4b17-bc45-232698f7afb7" containerID="12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb" exitCode=0 Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.247201 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" event={"ID":"d558a143-8d99-4b17-bc45-232698f7afb7","Type":"ContainerDied","Data":"12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.247233 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" event={"ID":"d558a143-8d99-4b17-bc45-232698f7afb7","Type":"ContainerStarted","Data":"c3c61bb38572fe79a59c0307492e1b60f11744a15b9c9c5079a71f802ebdb1ef"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.256511 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerStarted","Data":"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.263623 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerStarted","Data":"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.263679 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerStarted","Data":"72c0832aa3513d79261c762aad654265562c78afe4e99ad876d363929a2f9b1e"} Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.448894 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:22 crc kubenswrapper[4952]: E1122 03:11:22.449394 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" containerName="cinder-db-sync" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.449410 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" containerName="cinder-db-sync" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.449760 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" containerName="cinder-db-sync" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.451064 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.454215 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.454661 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.454779 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qv46f" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.468946 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.469121 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.590954 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.591786 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.591843 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7kd\" (UniqueName: \"kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.591865 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.591954 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.591982 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.599402 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.606005 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"] Nov 22 03:11:22 crc 
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.608167 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.644331 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"]
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694202 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694264 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694294 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694363 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694414 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5l7w\" (UniqueName: \"kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694433 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0"
Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694469 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7kd\" (UniqueName: \"kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0"
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694530 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.694569 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.706045 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.706390 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.706518 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.719060 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.719855 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.728508 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7kd\" (UniqueName: \"kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd\") pod \"cinder-scheduler-0\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.728839 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.730472 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.737871 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.755950 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.791273 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.796860 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.797013 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.797125 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.797316 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.797436 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5l7w\" (UniqueName: \"kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.797885 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.800612 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.801021 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.801807 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.830649 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5l7w\" (UniqueName: \"kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w\") pod \"dnsmasq-dns-6d97fcdd8f-zxqbc\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.901680 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902064 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902186 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqnx\" (UniqueName: \"kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902287 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902652 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.902917 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:22 crc kubenswrapper[4952]: I1122 03:11:22.961238 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005591 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005669 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005697 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqnx\" (UniqueName: \"kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005714 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005738 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005737 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.005778 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.006730 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.006971 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.012018 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " 
pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.015514 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.015665 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.026983 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.029729 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqnx\" (UniqueName: \"kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx\") pod \"cinder-api-0\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.111432 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.281306 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerStarted","Data":"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f"} Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.281611 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.281662 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:23 crc kubenswrapper[4952]: I1122 03:11:23.311082 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5897f54458-5r5nc" podStartSLOduration=3.311046207 podStartE2EDuration="3.311046207s" podCreationTimestamp="2025-11-22 03:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:23.306344471 +0000 UTC m=+1047.612361764" watchObservedRunningTime="2025-11-22 03:11:23.311046207 +0000 UTC m=+1047.617063480" Nov 22 03:11:24 crc kubenswrapper[4952]: I1122 03:11:24.881317 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:24 crc kubenswrapper[4952]: I1122 03:11:24.968222 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:24 crc kubenswrapper[4952]: I1122 03:11:24.986309 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"] Nov 22 03:11:25 crc kubenswrapper[4952]: W1122 03:11:25.015440 4952 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded45c0be_9309_43ec_a50b_0d6f4169d869.slice/crio-183b9d9cc912248ec1f7de9de30b6c294ed8df21f110c8eaa2f5bf33b6e684cc WatchSource:0}: Error finding container 183b9d9cc912248ec1f7de9de30b6c294ed8df21f110c8eaa2f5bf33b6e684cc: Status 404 returned error can't find the container with id 183b9d9cc912248ec1f7de9de30b6c294ed8df21f110c8eaa2f5bf33b6e684cc Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.305197 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerStarted","Data":"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.305808 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.307579 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerStarted","Data":"7d82e69b7a9692a6b105f96eb4111acadd26fc417a3e3347fb5bf42b8d766f1e"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.311471 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" event={"ID":"19657c5b-43d6-45c6-802f-7fe7e6665f11","Type":"ContainerStarted","Data":"9693210d0137d7874388a3bbe7e05ec02ea71710fedbcded168da127423cf5d1"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.311535 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" event={"ID":"19657c5b-43d6-45c6-802f-7fe7e6665f11","Type":"ContainerStarted","Data":"b7ba04ebd6594d3741efd19eab023cf572dcaa9c3fe55e7b8710c28a95c70023"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.315882 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerStarted","Data":"183b9d9cc912248ec1f7de9de30b6c294ed8df21f110c8eaa2f5bf33b6e684cc"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.323664 4952 generic.go:334] "Generic (PLEG): container finished" podID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerID="97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768" exitCode=0 Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.323827 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" event={"ID":"e8990647-b688-4c7d-acfa-c1287965cc3d","Type":"ContainerDied","Data":"97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.323867 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" event={"ID":"e8990647-b688-4c7d-acfa-c1287965cc3d","Type":"ContainerStarted","Data":"e32d7c8708ea591c95d4533b9c171de99de6d34a630ec51a502cc64ad542c354"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.337000 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6567315110000003 podStartE2EDuration="7.336981151s" podCreationTimestamp="2025-11-22 03:11:18 +0000 UTC" firstStartedPulling="2025-11-22 03:11:19.69482458 +0000 UTC m=+1044.000841853" lastFinishedPulling="2025-11-22 03:11:24.37507422 +0000 UTC m=+1048.681091493" observedRunningTime="2025-11-22 03:11:25.329979053 +0000 UTC m=+1049.635996326" 
watchObservedRunningTime="2025-11-22 03:11:25.336981151 +0000 UTC m=+1049.642998424" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.340663 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="dnsmasq-dns" containerID="cri-o://72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075" gracePeriod=10 Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.340952 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" event={"ID":"d558a143-8d99-4b17-bc45-232698f7afb7","Type":"ContainerStarted","Data":"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.341007 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.346759 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c47d667f9-d4njv" event={"ID":"cf053aa9-0d2f-486e-b410-89fa0afebaad","Type":"ContainerStarted","Data":"fadd65c5e35709a942f6f024cb5ce8c20a1a965c10d33fa0205f0d4f5a38efcb"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.346819 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c47d667f9-d4njv" event={"ID":"cf053aa9-0d2f-486e-b410-89fa0afebaad","Type":"ContainerStarted","Data":"717aead66491c96e53c8df4a4ff73955f0b44487c9a37e03393db06566b88822"} Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.402713 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f9d695d86-pd2dh" podStartSLOduration=2.590185322 podStartE2EDuration="5.402690392s" podCreationTimestamp="2025-11-22 03:11:20 +0000 UTC" firstStartedPulling="2025-11-22 03:11:21.532271074 +0000 UTC m=+1045.838288347" lastFinishedPulling="2025-11-22 03:11:24.344776144 +0000 UTC m=+1048.650793417" observedRunningTime="2025-11-22 03:11:25.397810761 +0000 UTC m=+1049.703828034" watchObservedRunningTime="2025-11-22 03:11:25.402690392 +0000 UTC m=+1049.708707655" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.431593 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" podStartSLOduration=5.4315650699999996 podStartE2EDuration="5.43156507s" podCreationTimestamp="2025-11-22 03:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:25.425195818 +0000 UTC m=+1049.731213091" watchObservedRunningTime="2025-11-22 03:11:25.43156507 +0000 UTC m=+1049.737582343" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.452818 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5c47d667f9-d4njv" podStartSLOduration=2.40975949 podStartE2EDuration="5.452790712s" podCreationTimestamp="2025-11-22 03:11:20 +0000 UTC" firstStartedPulling="2025-11-22 03:11:21.318026491 +0000 UTC m=+1045.624043764" lastFinishedPulling="2025-11-22 03:11:24.361057713 +0000 UTC m=+1048.667074986" observedRunningTime="2025-11-22 03:11:25.440740667 +0000 UTC m=+1049.746757940" watchObservedRunningTime="2025-11-22 03:11:25.452790712 +0000 UTC m=+1049.758807975" Nov 22 03:11:25 crc kubenswrapper[4952]: I1122 03:11:25.981236 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.123607 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.240283 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djqvr\" (UniqueName: \"kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr\") pod \"d558a143-8d99-4b17-bc45-232698f7afb7\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.240395 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config\") pod \"d558a143-8d99-4b17-bc45-232698f7afb7\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.240444 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb\") pod \"d558a143-8d99-4b17-bc45-232698f7afb7\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.240685 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc\") pod \"d558a143-8d99-4b17-bc45-232698f7afb7\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.240822 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb\") pod \"d558a143-8d99-4b17-bc45-232698f7afb7\" (UID: \"d558a143-8d99-4b17-bc45-232698f7afb7\") " Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.247272 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr" (OuterVolumeSpecName: "kube-api-access-djqvr") pod "d558a143-8d99-4b17-bc45-232698f7afb7" (UID: "d558a143-8d99-4b17-bc45-232698f7afb7"). InnerVolumeSpecName "kube-api-access-djqvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.296632 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d558a143-8d99-4b17-bc45-232698f7afb7" (UID: "d558a143-8d99-4b17-bc45-232698f7afb7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.296729 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d558a143-8d99-4b17-bc45-232698f7afb7" (UID: "d558a143-8d99-4b17-bc45-232698f7afb7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.325064 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config" (OuterVolumeSpecName: "config") pod "d558a143-8d99-4b17-bc45-232698f7afb7" (UID: "d558a143-8d99-4b17-bc45-232698f7afb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.338806 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d558a143-8d99-4b17-bc45-232698f7afb7" (UID: "d558a143-8d99-4b17-bc45-232698f7afb7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.342761 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.342791 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djqvr\" (UniqueName: \"kubernetes.io/projected/d558a143-8d99-4b17-bc45-232698f7afb7-kube-api-access-djqvr\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.342804 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.342813 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.342824 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d558a143-8d99-4b17-bc45-232698f7afb7-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.367761 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" event={"ID":"e8990647-b688-4c7d-acfa-c1287965cc3d","Type":"ContainerStarted","Data":"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711"} Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.369107 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.371476 4952 generic.go:334] "Generic (PLEG): container finished" podID="d558a143-8d99-4b17-bc45-232698f7afb7" containerID="72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075" exitCode=0 Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.371532 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" event={"ID":"d558a143-8d99-4b17-bc45-232698f7afb7","Type":"ContainerDied","Data":"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075"} Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.371575 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" 
event={"ID":"d558a143-8d99-4b17-bc45-232698f7afb7","Type":"ContainerDied","Data":"c3c61bb38572fe79a59c0307492e1b60f11744a15b9c9c5079a71f802ebdb1ef"} Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.371593 4952 scope.go:117] "RemoveContainer" containerID="72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.371691 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-52p5g" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.407740 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerStarted","Data":"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4"} Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.410299 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" podStartSLOduration=4.410280944 podStartE2EDuration="4.410280944s" podCreationTimestamp="2025-11-22 03:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:26.408499187 +0000 UTC m=+1050.714516460" watchObservedRunningTime="2025-11-22 03:11:26.410280944 +0000 UTC m=+1050.716298217" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.435119 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.441738 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-52p5g"] Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.462336 4952 scope.go:117] "RemoveContainer" containerID="12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.495374 4952 scope.go:117] "RemoveContainer" containerID="72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075" Nov 22 03:11:26 crc kubenswrapper[4952]: E1122 03:11:26.496254 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075\": container with ID starting with 72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075 not found: ID does not exist" containerID="72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.496301 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075"} err="failed to get container status \"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075\": rpc error: code = NotFound desc = could not find container \"72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075\": container with ID starting with 72381832469f5ed9a461900da15d695e8bfa25416b3751a71994dafdf02e2075 not found: ID does not exist" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.496446 4952 scope.go:117] "RemoveContainer" containerID="12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb" Nov 22 03:11:26 crc kubenswrapper[4952]: E1122 03:11:26.496794 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb\": container with ID starting with 12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb not found: ID does not exist" containerID="12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.496822 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb"} err="failed to get container status \"12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb\": rpc error: code = NotFound desc = could not find container \"12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb\": container with ID starting with 12f9837539de1a1ee3fd6929b0cdc40c47f73a4d11d59901d377ddf67404aeeb not found: ID does not exist" Nov 22 03:11:26 crc kubenswrapper[4952]: I1122 03:11:26.558106 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" path="/var/lib/kubelet/pods/d558a143-8d99-4b17-bc45-232698f7afb7/volumes" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.408134 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerStarted","Data":"eae47ed545e0b70c4a8d0c0bfa87c081d539bc6d4fb1d419fecef56dffaf4d4a"} Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.415736 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerStarted","Data":"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923"} Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.416057 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api-log" containerID="cri-o://5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" gracePeriod=30 Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.416250 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api" containerID="cri-o://32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" gracePeriod=30 Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.416475 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.452235 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.452208732 podStartE2EDuration="5.452208732s" podCreationTimestamp="2025-11-22 03:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:27.445323436 +0000 UTC m=+1051.751340739" watchObservedRunningTime="2025-11-22 03:11:27.452208732 +0000 UTC m=+1051.758225995" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.713885 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-746b4d95d8-xhmsq"] Nov 22 03:11:27 crc kubenswrapper[4952]: E1122 03:11:27.714497 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="dnsmasq-dns" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.714521 
4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="dnsmasq-dns" Nov 22 03:11:27 crc kubenswrapper[4952]: E1122 03:11:27.714559 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="init" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.714567 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="init" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.714796 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d558a143-8d99-4b17-bc45-232698f7afb7" containerName="dnsmasq-dns" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.716761 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.720096 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.721669 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.725401 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746b4d95d8-xhmsq"] Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775195 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775280 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-public-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775359 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-combined-ca-bundle\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775388 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-internal-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775414 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0307be-c90c-45a1-bef7-32bf07c7e35e-logs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775464 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nql2g\" (UniqueName: \"kubernetes.io/projected/0b0307be-c90c-45a1-bef7-32bf07c7e35e-kube-api-access-nql2g\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.775535 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data-custom\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.877831 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-combined-ca-bundle\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878369 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-internal-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878405 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0307be-c90c-45a1-bef7-32bf07c7e35e-logs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878461 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nql2g\" (UniqueName: \"kubernetes.io/projected/0b0307be-c90c-45a1-bef7-32bf07c7e35e-kube-api-access-nql2g\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878559 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data-custom\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878604 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.878668 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-public-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.879969 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0307be-c90c-45a1-bef7-32bf07c7e35e-logs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.887365 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-internal-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.887431 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data-custom\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.895258 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-config-data\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.896040 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-public-tls-certs\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.904478 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nql2g\" (UniqueName: \"kubernetes.io/projected/0b0307be-c90c-45a1-bef7-32bf07c7e35e-kube-api-access-nql2g\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:27 crc kubenswrapper[4952]: I1122 03:11:27.905100 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0307be-c90c-45a1-bef7-32bf07c7e35e-combined-ca-bundle\") pod \"barbican-api-746b4d95d8-xhmsq\" (UID: \"0b0307be-c90c-45a1-bef7-32bf07c7e35e\") " pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.095004 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.117727 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.184669 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqnx\" (UniqueName: \"kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.184770 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.184838 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.185009 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.185077 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.185110 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.185181 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs\") pod \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\" (UID: \"0964838d-cfe7-4e8c-a2ee-56a92b0775b4\") " Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.185349 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.186399 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.186685 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs" (OuterVolumeSpecName: "logs") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.189919 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.190879 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts" (OuterVolumeSpecName: "scripts") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.190899 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx" (OuterVolumeSpecName: "kube-api-access-wjqnx") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "kube-api-access-wjqnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.218589 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.231503 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data" (OuterVolumeSpecName: "config-data") pod "0964838d-cfe7-4e8c-a2ee-56a92b0775b4" (UID: "0964838d-cfe7-4e8c-a2ee-56a92b0775b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289173 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqnx\" (UniqueName: \"kubernetes.io/projected/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-kube-api-access-wjqnx\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289211 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289223 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289232 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289244 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.289257 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0964838d-cfe7-4e8c-a2ee-56a92b0775b4-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.431699 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerStarted","Data":"38fb433df5fe84c0a4dd2171dab78a364d06c18591fde71840c4cdbcf8b428c2"} Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.445425 4952 generic.go:334] "Generic (PLEG): container finished" podID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerID="32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" exitCode=0 Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.445455 4952 generic.go:334] "Generic (PLEG): container finished" podID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerID="5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" exitCode=143 Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.446731 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.449767 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerDied","Data":"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923"} Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.449860 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerDied","Data":"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4"} Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.449876 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0964838d-cfe7-4e8c-a2ee-56a92b0775b4","Type":"ContainerDied","Data":"7d82e69b7a9692a6b105f96eb4111acadd26fc417a3e3347fb5bf42b8d766f1e"} Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.449902 4952 scope.go:117] "RemoveContainer" containerID="32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.461157 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.532409164 podStartE2EDuration="6.46113582s" podCreationTimestamp="2025-11-22 03:11:22 +0000 UTC" firstStartedPulling="2025-11-22 03:11:25.039326001 +0000 UTC m=+1049.345343274" lastFinishedPulling="2025-11-22 03:11:25.968052657 +0000 UTC m=+1050.274069930" observedRunningTime="2025-11-22 03:11:28.458779786 +0000 UTC m=+1052.764797079" watchObservedRunningTime="2025-11-22 03:11:28.46113582 +0000 UTC m=+1052.767153093" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.511388 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.529536 4952 scope.go:117] "RemoveContainer" containerID="5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.548345 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.552711 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:28 crc kubenswrapper[4952]: E1122 03:11:28.553057 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api-log" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.553077 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api-log" Nov 22 03:11:28 crc kubenswrapper[4952]: E1122 03:11:28.553110 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.553116 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.553426 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api-log" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.553437 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" containerName="cinder-api" Nov 22 03:11:28 
crc kubenswrapper[4952]: I1122 03:11:28.554506 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.560976 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.561677 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.561835 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.568619 4952 scope.go:117] "RemoveContainer" containerID="32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.569253 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:28 crc kubenswrapper[4952]: E1122 03:11:28.576199 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923\": container with ID starting with 32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923 not found: ID does not exist" containerID="32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.576256 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923"} err="failed to get container status \"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923\": rpc error: code = NotFound desc = could not find container \"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923\": container with ID starting with 32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923 not found: ID does not exist" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.576285 4952 scope.go:117] "RemoveContainer" containerID="5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" Nov 22 03:11:28 crc kubenswrapper[4952]: E1122 03:11:28.577142 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4\": container with ID starting with 5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4 not found: ID does not exist" containerID="5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.577168 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4"} err="failed to get container status \"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4\": rpc error: code = NotFound desc = could not find container \"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4\": container with ID starting with 5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4 not found: ID does not exist" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.577182 4952 scope.go:117] "RemoveContainer" containerID="32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.577874 4952 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923"} err="failed to get container status \"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923\": rpc error: code = NotFound desc = could not find container \"32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923\": container with ID starting with 32e0be4fd3625c986694f752e82fe72c4be2490d5d4185e7ec969c11edb03923 not found: ID does not exist" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.577933 4952 scope.go:117] "RemoveContainer" containerID="5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.578633 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4"} err="failed to get container status \"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4\": rpc error: code = NotFound desc = could not find container \"5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4\": container with ID starting with 5efc3c0a441a05ad7941acf45c4376e3ca1396b3bfee0f08be8d2bf6c074a0c4 not found: ID does not exist" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.664776 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-746b4d95d8-xhmsq"] Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.698556 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvrr\" (UniqueName: \"kubernetes.io/projected/19fdfd30-4e69-4d96-951b-4daa193985e9-kube-api-access-jsvrr\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.701463 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.702586 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fdfd30-4e69-4d96-951b-4daa193985e9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.702644 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.703090 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.703183 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.703274 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fdfd30-4e69-4d96-951b-4daa193985e9-logs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.703393 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-scripts\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.703448 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.805258 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fdfd30-4e69-4d96-951b-4daa193985e9-logs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.808994 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-scripts\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.809164 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.809354 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvrr\" (UniqueName: \"kubernetes.io/projected/19fdfd30-4e69-4d96-951b-4daa193985e9-kube-api-access-jsvrr\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.809614 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.809754 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fdfd30-4e69-4d96-951b-4daa193985e9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.809854 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.810105 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.810270 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.808497 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fdfd30-4e69-4d96-951b-4daa193985e9-logs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.811930 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fdfd30-4e69-4d96-951b-4daa193985e9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.817014 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.817608 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.818942 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-scripts\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.819067 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-config-data\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.819486 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.820894 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/19fdfd30-4e69-4d96-951b-4daa193985e9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.826637 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvrr\" (UniqueName: \"kubernetes.io/projected/19fdfd30-4e69-4d96-951b-4daa193985e9-kube-api-access-jsvrr\") pod \"cinder-api-0\" (UID: \"19fdfd30-4e69-4d96-951b-4daa193985e9\") " pod="openstack/cinder-api-0" Nov 22 03:11:28 crc kubenswrapper[4952]: I1122 03:11:28.883352 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.448562 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:29 crc kubenswrapper[4952]: W1122 03:11:29.456991 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fdfd30_4e69_4d96_951b_4daa193985e9.slice/crio-d7f68b80704d4e6b2211606766b5c0c24e39d2b4c4da2d4262a46a5444b2fa6f WatchSource:0}: Error finding container d7f68b80704d4e6b2211606766b5c0c24e39d2b4c4da2d4262a46a5444b2fa6f: Status 404 returned error can't find the container with id d7f68b80704d4e6b2211606766b5c0c24e39d2b4c4da2d4262a46a5444b2fa6f Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.462240 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746b4d95d8-xhmsq" event={"ID":"0b0307be-c90c-45a1-bef7-32bf07c7e35e","Type":"ContainerStarted","Data":"586e3715dbebef777735036a645d142a9d2c6cab71277606a429d234f2bea98b"} Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.462299 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746b4d95d8-xhmsq" event={"ID":"0b0307be-c90c-45a1-bef7-32bf07c7e35e","Type":"ContainerStarted","Data":"ffb3f27b363cccef5ca0750c8b9db6df4625d4c9222ab72ddeaf9c65842025df"} Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.462311 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-746b4d95d8-xhmsq" event={"ID":"0b0307be-c90c-45a1-bef7-32bf07c7e35e","Type":"ContainerStarted","Data":"548541302b8abd259ccf29bac4f65ed9bc0a7b3aecfc5fb8393a763a1c80c170"} Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.462739 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.463113 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:29 crc kubenswrapper[4952]: I1122 03:11:29.485142 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-746b4d95d8-xhmsq" podStartSLOduration=2.485119284 podStartE2EDuration="2.485119284s" podCreationTimestamp="2025-11-22 03:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:29.483987913 +0000 UTC m=+1053.790005186" watchObservedRunningTime="2025-11-22 03:11:29.485119284 +0000 UTC m=+1053.791136557" Nov 22 03:11:30 crc kubenswrapper[4952]: I1122 03:11:30.480940 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"19fdfd30-4e69-4d96-951b-4daa193985e9","Type":"ContainerStarted","Data":"9ad77ebd583babc6b7884750963bfebcf0bbc51eba510f270fbe2dbcf2331029"} Nov 22 03:11:30 crc kubenswrapper[4952]: I1122 03:11:30.481513 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fdfd30-4e69-4d96-951b-4daa193985e9","Type":"ContainerStarted","Data":"d7f68b80704d4e6b2211606766b5c0c24e39d2b4c4da2d4262a46a5444b2fa6f"} Nov 22 03:11:30 crc kubenswrapper[4952]: I1122 03:11:30.547063 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0964838d-cfe7-4e8c-a2ee-56a92b0775b4" path="/var/lib/kubelet/pods/0964838d-cfe7-4e8c-a2ee-56a92b0775b4/volumes" Nov 22 03:11:31 crc kubenswrapper[4952]: I1122 03:11:31.499906 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fdfd30-4e69-4d96-951b-4daa193985e9","Type":"ContainerStarted","Data":"9e3ab1d5fa87df0a0127657efa5f9daec5d36b0fdb4f87634f562dca54e50a71"} Nov 22 03:11:31 crc kubenswrapper[4952]: I1122 03:11:31.500540 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 03:11:31 crc kubenswrapper[4952]: I1122 03:11:31.530201 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.530177213 podStartE2EDuration="3.530177213s" podCreationTimestamp="2025-11-22 03:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:31.528576919 +0000 UTC m=+1055.834594182" watchObservedRunningTime="2025-11-22 03:11:31.530177213 +0000 UTC m=+1055.836194486" Nov 22 03:11:32 crc kubenswrapper[4952]: I1122 03:11:32.629299 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:32 crc kubenswrapper[4952]: I1122 03:11:32.735624 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:32 crc kubenswrapper[4952]: I1122 03:11:32.792211 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 03:11:32 crc kubenswrapper[4952]: I1122 03:11:32.964830 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.068763 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.069173 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="dnsmasq-dns" containerID="cri-o://409519a594a484c997f443e2b43a0045738b830d193110da9bd138c3ae5ca1aa" gracePeriod=10 Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.075998 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.552942 4952 generic.go:334] "Generic (PLEG): container finished" podID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerID="409519a594a484c997f443e2b43a0045738b830d193110da9bd138c3ae5ca1aa" exitCode=0 Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.554078 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" 
event={"ID":"9a7f108a-a73c-40d7-8aa0-4050c7819915","Type":"ContainerDied","Data":"409519a594a484c997f443e2b43a0045738b830d193110da9bd138c3ae5ca1aa"} Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.653606 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.688599 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.858214 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb\") pod \"9a7f108a-a73c-40d7-8aa0-4050c7819915\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.858290 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnbpt\" (UniqueName: \"kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt\") pod \"9a7f108a-a73c-40d7-8aa0-4050c7819915\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.858447 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config\") pod \"9a7f108a-a73c-40d7-8aa0-4050c7819915\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.858649 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc\") pod \"9a7f108a-a73c-40d7-8aa0-4050c7819915\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.859033 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb\") pod \"9a7f108a-a73c-40d7-8aa0-4050c7819915\" (UID: \"9a7f108a-a73c-40d7-8aa0-4050c7819915\") " Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.895766 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt" (OuterVolumeSpecName: "kube-api-access-dnbpt") pod "9a7f108a-a73c-40d7-8aa0-4050c7819915" (UID: "9a7f108a-a73c-40d7-8aa0-4050c7819915"). InnerVolumeSpecName "kube-api-access-dnbpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.929341 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a7f108a-a73c-40d7-8aa0-4050c7819915" (UID: "9a7f108a-a73c-40d7-8aa0-4050c7819915"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.930746 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a7f108a-a73c-40d7-8aa0-4050c7819915" (UID: "9a7f108a-a73c-40d7-8aa0-4050c7819915"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.932990 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config" (OuterVolumeSpecName: "config") pod "9a7f108a-a73c-40d7-8aa0-4050c7819915" (UID: "9a7f108a-a73c-40d7-8aa0-4050c7819915"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.964137 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.964357 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnbpt\" (UniqueName: \"kubernetes.io/projected/9a7f108a-a73c-40d7-8aa0-4050c7819915-kube-api-access-dnbpt\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.964421 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.964496 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4952]: I1122 03:11:33.973815 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a7f108a-a73c-40d7-8aa0-4050c7819915" (UID: "9a7f108a-a73c-40d7-8aa0-4050c7819915"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.067604 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a7f108a-a73c-40d7-8aa0-4050c7819915-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.564240 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" event={"ID":"9a7f108a-a73c-40d7-8aa0-4050c7819915","Type":"ContainerDied","Data":"176792cbe0ff28c007c471f38f36b2dede1bd03ac35d39cae2a7f665438b4b0e"} Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.564323 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-fbkhc" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.564879 4952 scope.go:117] "RemoveContainer" containerID="409519a594a484c997f443e2b43a0045738b830d193110da9bd138c3ae5ca1aa" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.564420 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="cinder-scheduler" containerID="cri-o://eae47ed545e0b70c4a8d0c0bfa87c081d539bc6d4fb1d419fecef56dffaf4d4a" gracePeriod=30 Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.564429 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="probe" containerID="cri-o://38fb433df5fe84c0a4dd2171dab78a364d06c18591fde71840c4cdbcf8b428c2" gracePeriod=30 Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.592759 4952 scope.go:117] "RemoveContainer" containerID="6e7b184d50d8ea796698026b12dd584c56aecfeec5d780741a6a448de04556eb" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.607255 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.622956 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-fbkhc"] Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.706302 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:34 crc kubenswrapper[4952]: I1122 03:11:34.769193 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:35 crc kubenswrapper[4952]: I1122 03:11:35.581612 4952 generic.go:334] "Generic (PLEG): container finished" podID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerID="38fb433df5fe84c0a4dd2171dab78a364d06c18591fde71840c4cdbcf8b428c2" exitCode=0 Nov 22 03:11:35 crc kubenswrapper[4952]: I1122 03:11:35.582175 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerDied","Data":"38fb433df5fe84c0a4dd2171dab78a364d06c18591fde71840c4cdbcf8b428c2"} Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.156295 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.160072 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-65f67fb964-vlq9s" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.189838 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64f5d57fc8-zqkpt" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.564259 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" path="/var/lib/kubelet/pods/9a7f108a-a73c-40d7-8aa0-4050c7819915/volumes" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.738253 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 03:11:36 crc kubenswrapper[4952]: E1122 03:11:36.738723 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="init" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 
03:11:36.738756 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="init" Nov 22 03:11:36 crc kubenswrapper[4952]: E1122 03:11:36.738778 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="dnsmasq-dns" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.738787 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="dnsmasq-dns" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.738994 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7f108a-a73c-40d7-8aa0-4050c7819915" containerName="dnsmasq-dns" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.739748 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.744847 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.745325 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.749149 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-kcdlj" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.749978 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.780918 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-746b4d95d8-xhmsq" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.832217 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxd8d\" (UniqueName: \"kubernetes.io/projected/4768e499-4f1e-4a22-9e20-b05cf83f1c89-kube-api-access-rxd8d\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.832372 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config-secret\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.832770 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.832841 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.861964 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.862239 4952 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f54458-5r5nc" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api-log" containerID="cri-o://4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2" gracePeriod=30 Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.862410 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f54458-5r5nc" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api" containerID="cri-o://0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f" gracePeriod=30 Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.937822 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config-secret\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.937963 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.937991 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.938024 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxd8d\" (UniqueName: \"kubernetes.io/projected/4768e499-4f1e-4a22-9e20-b05cf83f1c89-kube-api-access-rxd8d\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.943572 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.955315 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85576cd755-lg8j8" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.955533 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.963833 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxd8d\" (UniqueName: \"kubernetes.io/projected/4768e499-4f1e-4a22-9e20-b05cf83f1c89-kube-api-access-rxd8d\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:36 crc kubenswrapper[4952]: I1122 03:11:36.974981 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4768e499-4f1e-4a22-9e20-b05cf83f1c89-openstack-config-secret\") pod \"openstackclient\" (UID: \"4768e499-4f1e-4a22-9e20-b05cf83f1c89\") " pod="openstack/openstackclient" Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.030732 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-856965f45b-8d4qq"] Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.031017 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-856965f45b-8d4qq" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-api" containerID="cri-o://e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f" gracePeriod=30 Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.031320 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-856965f45b-8d4qq" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-httpd" containerID="cri-o://de75c61bcfa98d6971fb7470dfcdbd2a151952e28b151207f88afbfd957018f8" gracePeriod=30 Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.059438 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.597775 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.630579 4952 generic.go:334] "Generic (PLEG): container finished" podID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerID="4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2" exitCode=143 Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.630751 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerDied","Data":"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2"} Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.636054 4952 generic.go:334] "Generic (PLEG): container finished" podID="d3701508-1734-46b5-83f3-9f08f930b294" containerID="de75c61bcfa98d6971fb7470dfcdbd2a151952e28b151207f88afbfd957018f8" exitCode=0 Nov 22 03:11:37 crc kubenswrapper[4952]: I1122 03:11:37.636107 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerDied","Data":"de75c61bcfa98d6971fb7470dfcdbd2a151952e28b151207f88afbfd957018f8"} Nov 22 03:11:38 crc kubenswrapper[4952]: I1122 03:11:38.657758 4952 generic.go:334] "Generic (PLEG): container finished" podID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerID="eae47ed545e0b70c4a8d0c0bfa87c081d539bc6d4fb1d419fecef56dffaf4d4a" exitCode=0 Nov 22 03:11:38 crc kubenswrapper[4952]: I1122 03:11:38.657842 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerDied","Data":"eae47ed545e0b70c4a8d0c0bfa87c081d539bc6d4fb1d419fecef56dffaf4d4a"} Nov 22 03:11:38 crc kubenswrapper[4952]: I1122 03:11:38.660421 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4768e499-4f1e-4a22-9e20-b05cf83f1c89","Type":"ContainerStarted","Data":"7c84a77058510e3f7145018f7252484442b446c62f7da150f8f21f5b4741594e"} Nov 22 03:11:38 crc kubenswrapper[4952]: I1122 03:11:38.995382 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.112921 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.113508 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.113610 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.113697 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.113826 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr7kd\" (UniqueName: \"kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.114014 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id\") pod \"ed45c0be-9309-43ec-a50b-0d6f4169d869\" (UID: \"ed45c0be-9309-43ec-a50b-0d6f4169d869\") " Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.114596 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.128612 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.129213 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd" (OuterVolumeSpecName: "kube-api-access-dr7kd") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "kube-api-access-dr7kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.136707 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts" (OuterVolumeSpecName: "scripts") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.208775 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.217021 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.217084 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.217096 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.217106 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr7kd\" (UniqueName: \"kubernetes.io/projected/ed45c0be-9309-43ec-a50b-0d6f4169d869-kube-api-access-dr7kd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.217120 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed45c0be-9309-43ec-a50b-0d6f4169d869-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.275955 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data" (OuterVolumeSpecName: "config-data") pod "ed45c0be-9309-43ec-a50b-0d6f4169d869" (UID: "ed45c0be-9309-43ec-a50b-0d6f4169d869"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.319224 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed45c0be-9309-43ec-a50b-0d6f4169d869-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.693289 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ed45c0be-9309-43ec-a50b-0d6f4169d869","Type":"ContainerDied","Data":"183b9d9cc912248ec1f7de9de30b6c294ed8df21f110c8eaa2f5bf33b6e684cc"} Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.693352 4952 scope.go:117] "RemoveContainer" containerID="38fb433df5fe84c0a4dd2171dab78a364d06c18591fde71840c4cdbcf8b428c2" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.693412 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.728401 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.750879 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.781014 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:39 crc kubenswrapper[4952]: E1122 03:11:39.781596 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="probe" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.781611 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="probe" Nov 22 03:11:39 crc kubenswrapper[4952]: E1122 03:11:39.781666 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="cinder-scheduler" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.781674 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="cinder-scheduler" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.781857 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="probe" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.781888 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" containerName="cinder-scheduler" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.783309 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.794024 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.794702 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.805995 4952 scope.go:117] "RemoveContainer" containerID="eae47ed545e0b70c4a8d0c0bfa87c081d539bc6d4fb1d419fecef56dffaf4d4a" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.943929 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxslm\" (UniqueName: \"kubernetes.io/projected/2ff51385-a462-478d-bd61-62d15d7c5c41-kube-api-access-hxslm\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.944528 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.944581 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.944767 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff51385-a462-478d-bd61-62d15d7c5c41-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.944907 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:39 crc kubenswrapper[4952]: I1122 03:11:39.945001 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047385 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047532 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxslm\" (UniqueName: \"kubernetes.io/projected/2ff51385-a462-478d-bd61-62d15d7c5c41-kube-api-access-hxslm\") pod 
\"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047623 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047650 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047690 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff51385-a462-478d-bd61-62d15d7c5c41-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.047721 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.051324 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff51385-a462-478d-bd61-62d15d7c5c41-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.055666 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.056231 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.057505 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.064263 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff51385-a462-478d-bd61-62d15d7c5c41-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.069532 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxslm\" (UniqueName: 
\"kubernetes.io/projected/2ff51385-a462-478d-bd61-62d15d7c5c41-kube-api-access-hxslm\") pod \"cinder-scheduler-0\" (UID: \"2ff51385-a462-478d-bd61-62d15d7c5c41\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.144637 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.565492 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed45c0be-9309-43ec-a50b-0d6f4169d869" path="/var/lib/kubelet/pods/ed45c0be-9309-43ec-a50b-0d6f4169d869/volumes" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.697186 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.747260 4952 generic.go:334] "Generic (PLEG): container finished" podID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerID="0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f" exitCode=0 Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.747559 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerDied","Data":"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f"} Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.747703 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f54458-5r5nc" event={"ID":"16fdf15b-0bbd-440a-835b-0ef98ef6b28d","Type":"ContainerDied","Data":"72c0832aa3513d79261c762aad654265562c78afe4e99ad876d363929a2f9b1e"} Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.747775 4952 scope.go:117] "RemoveContainer" containerID="0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.747986 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897f54458-5r5nc" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.810266 4952 scope.go:117] "RemoveContainer" containerID="4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.872700 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22m9l\" (UniqueName: \"kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l\") pod \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.873169 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs\") pod \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.873427 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle\") pod \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.873478 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom\") pod \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.873522 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data\") pod \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\" (UID: \"16fdf15b-0bbd-440a-835b-0ef98ef6b28d\") " Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.872709 4952 scope.go:117] "RemoveContainer" containerID="0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.874668 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs" (OuterVolumeSpecName: "logs") pod "16fdf15b-0bbd-440a-835b-0ef98ef6b28d" (UID: "16fdf15b-0bbd-440a-835b-0ef98ef6b28d"). InnerVolumeSpecName "logs". 
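
Note: in the UnmountVolume.TearDown lines around a pod deletion, OuterVolumeSpecName is the volume name as written in the pod spec and InnerVolumeSpecName is the name the plugin actually operates on; for the secret, projected, and empty-dir volumes in this capture the two coincide. The duplicate "RemoveContainer" entries and the slightly out-of-order microsecond timestamps (03:11:40.872709 logged after .873522) are normal: several kubelet goroutines log concurrently, so nearby entries can interleave.
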
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.875908 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:40 crc kubenswrapper[4952]: E1122 03:11:40.878094 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f\": container with ID starting with 0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f not found: ID does not exist" containerID="0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.878153 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f"} err="failed to get container status \"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f\": rpc error: code = NotFound desc = could not find container \"0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f\": container with ID starting with 0a2dfd5a8c288721fb08a0bee8d534b05b0460ed66ad06257a8dec074ed88e3f not found: ID does not exist" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.878269 4952 scope.go:117] "RemoveContainer" containerID="4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.879636 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16fdf15b-0bbd-440a-835b-0ef98ef6b28d" (UID: "16fdf15b-0bbd-440a-835b-0ef98ef6b28d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:40 crc kubenswrapper[4952]: E1122 03:11:40.879892 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2\": container with ID starting with 4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2 not found: ID does not exist" containerID="4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.879943 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2"} err="failed to get container status \"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2\": rpc error: code = NotFound desc = could not find container \"4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2\": container with ID starting with 4b2b791268158fda57c42dd867cd67f2920b83cbe46877eb93841f19645345c2 not found: ID does not exist" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.880816 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l" (OuterVolumeSpecName: "kube-api-access-22m9l") pod "16fdf15b-0bbd-440a-835b-0ef98ef6b28d" (UID: "16fdf15b-0bbd-440a-835b-0ef98ef6b28d"). InnerVolumeSpecName "kube-api-access-22m9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.898667 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16fdf15b-0bbd-440a-835b-0ef98ef6b28d" (UID: "16fdf15b-0bbd-440a-835b-0ef98ef6b28d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.922825 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data" (OuterVolumeSpecName: "config-data") pod "16fdf15b-0bbd-440a-835b-0ef98ef6b28d" (UID: "16fdf15b-0bbd-440a-835b-0ef98ef6b28d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.975989 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.976032 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.976043 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.976053 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22m9l\" (UniqueName: \"kubernetes.io/projected/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-kube-api-access-22m9l\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:40 crc kubenswrapper[4952]: I1122 03:11:40.976067 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fdf15b-0bbd-440a-835b-0ef98ef6b28d-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:41 crc kubenswrapper[4952]: I1122 03:11:41.108757 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:41 crc kubenswrapper[4952]: I1122 03:11:41.109678 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5897f54458-5r5nc"] Nov 22 03:11:41 crc kubenswrapper[4952]: I1122 03:11:41.460878 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 03:11:41 crc kubenswrapper[4952]: I1122 03:11:41.778984 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff51385-a462-478d-bd61-62d15d7c5c41","Type":"ContainerStarted","Data":"464d381b31df816a649c28355b1b2973a550764c03a78a886dd54089bcfef5a0"} Nov 22 03:11:41 crc kubenswrapper[4952]: I1122 03:11:41.779404 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff51385-a462-478d-bd61-62d15d7c5c41","Type":"ContainerStarted","Data":"a12db9e33ed176d2a526f9ba1bdc03dd282f34ec5fa50f0b251b4d65430b7819"} Nov 22 03:11:42 crc kubenswrapper[4952]: I1122 03:11:42.546616 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" 
path="/var/lib/kubelet/pods/16fdf15b-0bbd-440a-835b-0ef98ef6b28d/volumes" Nov 22 03:11:42 crc kubenswrapper[4952]: I1122 03:11:42.804590 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff51385-a462-478d-bd61-62d15d7c5c41","Type":"ContainerStarted","Data":"565d9212ba16c55f6b2d3c756f1d42c49c81ae98bf941136f9bd6e02f6c09837"} Nov 22 03:11:42 crc kubenswrapper[4952]: I1122 03:11:42.856662 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.856638233 podStartE2EDuration="3.856638233s" podCreationTimestamp="2025-11-22 03:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:42.847225839 +0000 UTC m=+1067.153243112" watchObservedRunningTime="2025-11-22 03:11:42.856638233 +0000 UTC m=+1067.162655506" Nov 22 03:11:44 crc kubenswrapper[4952]: E1122 03:11:44.752039 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3701508_1734_46b5_83f3_9f08f930b294.slice/crio-conmon-e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:11:44 crc kubenswrapper[4952]: I1122 03:11:44.829412 4952 generic.go:334] "Generic (PLEG): container finished" podID="d3701508-1734-46b5-83f3-9f08f930b294" containerID="e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f" exitCode=0 Nov 22 03:11:44 crc kubenswrapper[4952]: I1122 03:11:44.829476 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerDied","Data":"e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f"} Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.068525 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lf6jj"] Nov 22 03:11:45 crc kubenswrapper[4952]: E1122 03:11:45.069563 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api-log" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.069672 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api-log" Nov 22 03:11:45 crc kubenswrapper[4952]: E1122 03:11:45.069769 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.069883 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.070163 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.070235 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fdf15b-0bbd-440a-835b-0ef98ef6b28d" containerName="barbican-api-log" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.071088 4952 util.go:30] "No sandbox for pod can be found. 
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.081335 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lf6jj"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.145374 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.166883 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8qlsn"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.168400 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.173940 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm5js\" (UniqueName: \"kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.174028 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.183118 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8qlsn"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.271711 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2d2e-account-create-kxv6b"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.273302 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.276609 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdszt\" (UniqueName: \"kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.276655 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm5js\" (UniqueName: \"kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.276699 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.276774 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.278367 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.278743 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.286530 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8ddxz"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.289174 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8ddxz"
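
Note: each of these short-lived nova db-create and account-create pods carries the same two volumes: an operator-scripts ConfigMap and the admission-injected service-account token (the kube-api-access-* projected volume). A sketch of that volume set using the real k8s.io/api/core/v1 types; the names mirror the log, but the construction is illustrative, since the actual pods are built by the openstack operators:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        vols := []corev1.Volume{
            {
                Name: "operator-scripts",
                VolumeSource: corev1.VolumeSource{
                    ConfigMap: &corev1.ConfigMapVolumeSource{
                        LocalObjectReference: corev1.LocalObjectReference{Name: "operator-scripts"},
                    },
                },
            },
            {
                Name: "kube-api-access-xm5js",
                VolumeSource: corev1.VolumeSource{
                    // In a real pod this projects the token, ca.crt and namespace.
                    Projected: &corev1.ProjectedVolumeSource{},
                },
            },
        }
        for _, v := range vols {
            fmt.Println(v.Name)
        }
    }
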
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.316694 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d2e-account-create-kxv6b"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.325446 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm5js\" (UniqueName: \"kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js\") pod \"nova-api-db-create-lf6jj\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.330616 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8ddxz"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.378747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.378843 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5wq\" (UniqueName: \"kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.378883 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.378936 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7kqr\" (UniqueName: \"kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.378961 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdszt\" (UniqueName: \"kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.379003 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.379801 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.397583 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdszt\" (UniqueName: \"kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt\") pod \"nova-cell0-db-create-8qlsn\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") " pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.417810 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lf6jj"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.469683 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ac7e-account-create-p4htv"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.471461 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac7e-account-create-p4htv"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.480669 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7kqr\" (UniqueName: \"kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.480764 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.480865 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5wq\" (UniqueName: \"kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.480891 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.481655 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ac7e-account-create-p4htv"]
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.481808 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.482035 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.482584 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz"
\"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.505268 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qlsn" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.507488 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7kqr\" (UniqueName: \"kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr\") pod \"nova-cell1-db-create-8ddxz\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " pod="openstack/nova-cell1-db-create-8ddxz" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.523427 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5wq\" (UniqueName: \"kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq\") pod \"nova-api-2d2e-account-create-kxv6b\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " pod="openstack/nova-api-2d2e-account-create-kxv6b" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.583408 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm2t6\" (UniqueName: \"kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.583480 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.594506 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d2e-account-create-kxv6b" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.618439 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8ddxz" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.672008 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cf0c-account-create-24pqf"] Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.673844 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.676183 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.686881 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cf0c-account-create-24pqf"] Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.690060 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.690351 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm2t6\" (UniqueName: \"kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.690389 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8p5b\" (UniqueName: \"kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.690482 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.693968 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.715965 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm2t6\" (UniqueName: \"kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6\") pod \"nova-cell0-ac7e-account-create-p4htv\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.792664 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.792800 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8p5b\" (UniqueName: 
\"kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.793760 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.811369 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:45 crc kubenswrapper[4952]: I1122 03:11:45.823315 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8p5b\" (UniqueName: \"kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b\") pod \"nova-cell1-cf0c-account-create-24pqf\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:46 crc kubenswrapper[4952]: I1122 03:11:46.006330 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:48 crc kubenswrapper[4952]: I1122 03:11:48.564737 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:11:49 crc kubenswrapper[4952]: I1122 03:11:49.622741 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:49 crc kubenswrapper[4952]: I1122 03:11:49.626286 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-central-agent" containerID="cri-o://a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4" gracePeriod=30 Nov 22 03:11:49 crc kubenswrapper[4952]: I1122 03:11:49.626333 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="sg-core" containerID="cri-o://4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8" gracePeriod=30 Nov 22 03:11:49 crc kubenswrapper[4952]: I1122 03:11:49.626485 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-notification-agent" containerID="cri-o://719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034" gracePeriod=30 Nov 22 03:11:49 crc kubenswrapper[4952]: I1122 03:11:49.626595 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="proxy-httpd" containerID="cri-o://51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb" gracePeriod=30 Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.453631 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.908932 4952 generic.go:334] "Generic (PLEG): container finished" podID="b90cc371-f277-4e5c-9e95-5b8233d75503" 
containerID="51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb" exitCode=0 Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.910022 4952 generic.go:334] "Generic (PLEG): container finished" podID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerID="4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8" exitCode=2 Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.910097 4952 generic.go:334] "Generic (PLEG): container finished" podID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerID="a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4" exitCode=0 Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.909195 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerDied","Data":"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb"} Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.910239 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerDied","Data":"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8"} Nov 22 03:11:50 crc kubenswrapper[4952]: I1122 03:11:50.910314 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerDied","Data":"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4"} Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.648423 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-856965f45b-8d4qq" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.740858 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config\") pod \"d3701508-1734-46b5-83f3-9f08f930b294\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.741069 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config\") pod \"d3701508-1734-46b5-83f3-9f08f930b294\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.741198 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs\") pod \"d3701508-1734-46b5-83f3-9f08f930b294\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.741218 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle\") pod \"d3701508-1734-46b5-83f3-9f08f930b294\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.741256 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw4sn\" (UniqueName: \"kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn\") pod \"d3701508-1734-46b5-83f3-9f08f930b294\" (UID: \"d3701508-1734-46b5-83f3-9f08f930b294\") " Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.748855 4952 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d3701508-1734-46b5-83f3-9f08f930b294" (UID: "d3701508-1734-46b5-83f3-9f08f930b294"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.749770 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn" (OuterVolumeSpecName: "kube-api-access-kw4sn") pod "d3701508-1734-46b5-83f3-9f08f930b294" (UID: "d3701508-1734-46b5-83f3-9f08f930b294"). InnerVolumeSpecName "kube-api-access-kw4sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.794917 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3701508-1734-46b5-83f3-9f08f930b294" (UID: "d3701508-1734-46b5-83f3-9f08f930b294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.800158 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config" (OuterVolumeSpecName: "config") pod "d3701508-1734-46b5-83f3-9f08f930b294" (UID: "d3701508-1734-46b5-83f3-9f08f930b294"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.824244 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d3701508-1734-46b5-83f3-9f08f930b294" (UID: "d3701508-1734-46b5-83f3-9f08f930b294"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.846737 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.846788 4952 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.846803 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.846822 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw4sn\" (UniqueName: \"kubernetes.io/projected/d3701508-1734-46b5-83f3-9f08f930b294-kube-api-access-kw4sn\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.846837 4952 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3701508-1734-46b5-83f3-9f08f930b294-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.904249 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ac7e-account-create-p4htv"] Nov 22 03:11:51 crc kubenswrapper[4952]: W1122 03:11:51.910350 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73e75a6d_eb3a_4e92_a13c_8ab87d2beb4f.slice/crio-497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936 WatchSource:0}: Error finding container 497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936: Status 404 returned error can't find the container with id 497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936 Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.914370 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8ddxz"] Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.928415 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-856965f45b-8d4qq" event={"ID":"d3701508-1734-46b5-83f3-9f08f930b294","Type":"ContainerDied","Data":"065c40827d83ee6dc4c8bd823d97444fc3a165ea52aad31d330ca8672ed719d2"} Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.928476 4952 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.928505 4952 scope.go:117] "RemoveContainer" containerID="de75c61bcfa98d6971fb7470dfcdbd2a151952e28b151207f88afbfd957018f8"
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.936104 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4768e499-4f1e-4a22-9e20-b05cf83f1c89","Type":"ContainerStarted","Data":"4de56ddbb5da21424b21c4907cfd2aafd07b73ed9d586dcb30e158d6a67d21b3"}
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.959263 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.366170256 podStartE2EDuration="15.959235476s" podCreationTimestamp="2025-11-22 03:11:36 +0000 UTC" firstStartedPulling="2025-11-22 03:11:37.606718771 +0000 UTC m=+1061.912736044" lastFinishedPulling="2025-11-22 03:11:51.199783991 +0000 UTC m=+1075.505801264" observedRunningTime="2025-11-22 03:11:51.951790655 +0000 UTC m=+1076.257807938" watchObservedRunningTime="2025-11-22 03:11:51.959235476 +0000 UTC m=+1076.265252749"
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.986163 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-856965f45b-8d4qq"]
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.993001 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-856965f45b-8d4qq"]
Nov 22 03:11:51 crc kubenswrapper[4952]: I1122 03:11:51.993326 4952 scope.go:117] "RemoveContainer" containerID="e9bd255dde565b42faf43b862902b2902ef36a02022be70b9acc3b15327ddb3f"
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.142215 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cf0c-account-create-24pqf"]
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.148207 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lf6jj"]
Nov 22 03:11:52 crc kubenswrapper[4952]: W1122 03:11:52.155706 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44db129c_1074_4f16_9c71_31a59ddd62a5.slice/crio-8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1 WatchSource:0}: Error finding container 8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1: Status 404 returned error can't find the container with id 8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.156730 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8qlsn"]
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.167181 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d2e-account-create-kxv6b"]
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.549530 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3701508-1734-46b5-83f3-9f08f930b294" path="/var/lib/kubelet/pods/d3701508-1734-46b5-83f3-9f08f930b294/volumes"
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.950346 4952 generic.go:334] "Generic (PLEG): container finished" podID="ca19638a-64bf-4f46-84d3-efd709c1593f" containerID="4407dc190d3c77ff016b7c68de5c18baa7da4a3fa665d1b70b120dd9aefbcc71" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.950435 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cf0c-account-create-24pqf" event={"ID":"ca19638a-64bf-4f46-84d3-efd709c1593f","Type":"ContainerDied","Data":"4407dc190d3c77ff016b7c68de5c18baa7da4a3fa665d1b70b120dd9aefbcc71"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.950473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cf0c-account-create-24pqf" event={"ID":"ca19638a-64bf-4f46-84d3-efd709c1593f","Type":"ContainerStarted","Data":"02a3c05456caea4b8381948424985423f54feed051cf5c4a6539f31bb1d71c32"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.963512 4952 generic.go:334] "Generic (PLEG): container finished" podID="dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" containerID="17299fdc2a7741cbafd0c25b49e6d0dd900bc3d7a5e74b1a793d5d0e1982d30e" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.963752 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d2e-account-create-kxv6b" event={"ID":"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c","Type":"ContainerDied","Data":"17299fdc2a7741cbafd0c25b49e6d0dd900bc3d7a5e74b1a793d5d0e1982d30e"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.964009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d2e-account-create-kxv6b" event={"ID":"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c","Type":"ContainerStarted","Data":"304ddb5e86a3128b2371c0c3d424fd8d0305ca6c5868325f1b6603e439dbf2d6"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.967359 4952 generic.go:334] "Generic (PLEG): container finished" podID="73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" containerID="bd98db4a7b48722970b5456cc920c7608b8abb3b36fd5e7f18956ea06a0f6561" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.967407 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac7e-account-create-p4htv" event={"ID":"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f","Type":"ContainerDied","Data":"bd98db4a7b48722970b5456cc920c7608b8abb3b36fd5e7f18956ea06a0f6561"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.967640 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac7e-account-create-p4htv" event={"ID":"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f","Type":"ContainerStarted","Data":"497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.969703 4952 generic.go:334] "Generic (PLEG): container finished" podID="74840955-d478-4dfb-a30d-9cff482b4e7e" containerID="e64e18e0ef8bb22854ad2fb7150a00e9045015c98fb158ea5bb96431a2d02b05" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.969860 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8ddxz" event={"ID":"74840955-d478-4dfb-a30d-9cff482b4e7e","Type":"ContainerDied","Data":"e64e18e0ef8bb22854ad2fb7150a00e9045015c98fb158ea5bb96431a2d02b05"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.969932 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8ddxz" event={"ID":"74840955-d478-4dfb-a30d-9cff482b4e7e","Type":"ContainerStarted","Data":"3c524e5519d0b87a7c6b7277dc0f69b9da909592890261c7e615033235220fc5"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.976362 4952 generic.go:334] "Generic (PLEG): container finished" podID="44db129c-1074-4f16-9c71-31a59ddd62a5" containerID="7c0217a1e066fe1a68299984058ad5be4a2e12998d7ae80458072bfc845d07d9" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.976490 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qlsn" event={"ID":"44db129c-1074-4f16-9c71-31a59ddd62a5","Type":"ContainerDied","Data":"7c0217a1e066fe1a68299984058ad5be4a2e12998d7ae80458072bfc845d07d9"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.976561 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qlsn" event={"ID":"44db129c-1074-4f16-9c71-31a59ddd62a5","Type":"ContainerStarted","Data":"8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.982372 4952 generic.go:334] "Generic (PLEG): container finished" podID="f5cc4a94-12fd-42a3-b2cb-7d04163ca285" containerID="8c3835e24c1374eeee640debac1908efdf2a4d40b0a7aca1579f293751d88570" exitCode=0
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.983510 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lf6jj" event={"ID":"f5cc4a94-12fd-42a3-b2cb-7d04163ca285","Type":"ContainerDied","Data":"8c3835e24c1374eeee640debac1908efdf2a4d40b0a7aca1579f293751d88570"}
Nov 22 03:11:52 crc kubenswrapper[4952]: I1122 03:11:52.983565 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lf6jj" event={"ID":"f5cc4a94-12fd-42a3-b2cb-7d04163ca285","Type":"ContainerStarted","Data":"c49a14f4017c7995264e6d466b9aa93064532ec8875fbe41d8fa8d6685a6db38"}
Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.454071 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8qlsn"
Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.509428 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts\") pod \"44db129c-1074-4f16-9c71-31a59ddd62a5\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") "
Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.509557 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdszt\" (UniqueName: \"kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt\") pod \"44db129c-1074-4f16-9c71-31a59ddd62a5\" (UID: \"44db129c-1074-4f16-9c71-31a59ddd62a5\") "
Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.510927 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44db129c-1074-4f16-9c71-31a59ddd62a5" (UID: "44db129c-1074-4f16-9c71-31a59ddd62a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.519756 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt" (OuterVolumeSpecName: "kube-api-access-xdszt") pod "44db129c-1074-4f16-9c71-31a59ddd62a5" (UID: "44db129c-1074-4f16-9c71-31a59ddd62a5"). InnerVolumeSpecName "kube-api-access-xdszt". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.620364 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db129c-1074-4f16-9c71-31a59ddd62a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.620414 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdszt\" (UniqueName: \"kubernetes.io/projected/44db129c-1074-4f16-9c71-31a59ddd62a5-kube-api-access-xdszt\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.692193 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.739727 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d2e-account-create-kxv6b" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.749224 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8ddxz" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.765338 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.778926 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lf6jj" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.787299 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.823851 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.824617 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") pod \"74840955-d478-4dfb-a30d-9cff482b4e7e\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.826770 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7kqr\" (UniqueName: \"kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr\") pod \"74840955-d478-4dfb-a30d-9cff482b4e7e\" (UID: \"74840955-d478-4dfb-a30d-9cff482b4e7e\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.826956 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.826998 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc5wq\" (UniqueName: \"kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq\") pod \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " Nov 22 03:11:54 crc 
kubenswrapper[4952]: I1122 03:11:54.827026 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.827052 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8p5b\" (UniqueName: \"kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b\") pod \"ca19638a-64bf-4f46-84d3-efd709c1593f\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.827092 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twctv\" (UniqueName: \"kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.827124 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.827169 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm5js\" (UniqueName: \"kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js\") pod \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.826338 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74840955-d478-4dfb-a30d-9cff482b4e7e" (UID: "74840955-d478-4dfb-a30d-9cff482b4e7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.828595 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.831707 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.831882 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js" (OuterVolumeSpecName: "kube-api-access-xm5js") pod "f5cc4a94-12fd-42a3-b2cb-7d04163ca285" (UID: "f5cc4a94-12fd-42a3-b2cb-7d04163ca285"). InnerVolumeSpecName "kube-api-access-xm5js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832141 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832226 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts\") pod \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832269 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts\") pod \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\" (UID: \"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832359 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm2t6\" (UniqueName: \"kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6\") pod \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\" (UID: \"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832392 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts\") pod \"ca19638a-64bf-4f46-84d3-efd709c1593f\" (UID: \"ca19638a-64bf-4f46-84d3-efd709c1593f\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832429 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle\") pod \"b90cc371-f277-4e5c-9e95-5b8233d75503\" (UID: \"b90cc371-f277-4e5c-9e95-5b8233d75503\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.832461 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts\") pod \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\" (UID: \"f5cc4a94-12fd-42a3-b2cb-7d04163ca285\") " Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.834137 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74840955-d478-4dfb-a30d-9cff482b4e7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.834161 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.834172 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm5js\" (UniqueName: \"kubernetes.io/projected/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-kube-api-access-xm5js\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.834182 4952 reconciler_common.go:293] 
"Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b90cc371-f277-4e5c-9e95-5b8233d75503-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.835447 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca19638a-64bf-4f46-84d3-efd709c1593f" (UID: "ca19638a-64bf-4f46-84d3-efd709c1593f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.835469 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" (UID: "dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.835976 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5cc4a94-12fd-42a3-b2cb-7d04163ca285" (UID: "f5cc4a94-12fd-42a3-b2cb-7d04163ca285"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.836028 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" (UID: "73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.837042 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts" (OuterVolumeSpecName: "scripts") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.837102 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq" (OuterVolumeSpecName: "kube-api-access-qc5wq") pod "dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" (UID: "dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c"). InnerVolumeSpecName "kube-api-access-qc5wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.837966 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b" (OuterVolumeSpecName: "kube-api-access-l8p5b") pod "ca19638a-64bf-4f46-84d3-efd709c1593f" (UID: "ca19638a-64bf-4f46-84d3-efd709c1593f"). InnerVolumeSpecName "kube-api-access-l8p5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.838460 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv" (OuterVolumeSpecName: "kube-api-access-twctv") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "kube-api-access-twctv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.839261 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6" (OuterVolumeSpecName: "kube-api-access-dm2t6") pod "73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" (UID: "73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f"). InnerVolumeSpecName "kube-api-access-dm2t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.856996 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr" (OuterVolumeSpecName: "kube-api-access-l7kqr") pod "74840955-d478-4dfb-a30d-9cff482b4e7e" (UID: "74840955-d478-4dfb-a30d-9cff482b4e7e"). InnerVolumeSpecName "kube-api-access-l7kqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.874321 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936775 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5cc4a94-12fd-42a3-b2cb-7d04163ca285-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936823 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7kqr\" (UniqueName: \"kubernetes.io/projected/74840955-d478-4dfb-a30d-9cff482b4e7e-kube-api-access-l7kqr\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936838 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc5wq\" (UniqueName: \"kubernetes.io/projected/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-kube-api-access-qc5wq\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936848 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936860 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8p5b\" (UniqueName: \"kubernetes.io/projected/ca19638a-64bf-4f46-84d3-efd709c1593f-kube-api-access-l8p5b\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936875 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twctv\" (UniqueName: \"kubernetes.io/projected/b90cc371-f277-4e5c-9e95-5b8233d75503-kube-api-access-twctv\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936886 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936898 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936908 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936919 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm2t6\" (UniqueName: \"kubernetes.io/projected/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f-kube-api-access-dm2t6\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.936930 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca19638a-64bf-4f46-84d3-efd709c1593f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.944586 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4952]: I1122 03:11:54.952493 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data" (OuterVolumeSpecName: "config-data") pod "b90cc371-f277-4e5c-9e95-5b8233d75503" (UID: "b90cc371-f277-4e5c-9e95-5b8233d75503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.003570 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cf0c-account-create-24pqf" event={"ID":"ca19638a-64bf-4f46-84d3-efd709c1593f","Type":"ContainerDied","Data":"02a3c05456caea4b8381948424985423f54feed051cf5c4a6539f31bb1d71c32"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.003932 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02a3c05456caea4b8381948424985423f54feed051cf5c4a6539f31bb1d71c32" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.003597 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cf0c-account-create-24pqf" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.013905 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d2e-account-create-kxv6b" event={"ID":"dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c","Type":"ContainerDied","Data":"304ddb5e86a3128b2371c0c3d424fd8d0305ca6c5868325f1b6603e439dbf2d6"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.013932 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d2e-account-create-kxv6b" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.014067 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="304ddb5e86a3128b2371c0c3d424fd8d0305ca6c5868325f1b6603e439dbf2d6" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.016483 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac7e-account-create-p4htv" event={"ID":"73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f","Type":"ContainerDied","Data":"497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.016521 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497b82fad7e229080ace00a0945665263497c68cc9d63998afe3757488acf936" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.016538 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac7e-account-create-p4htv" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.020069 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8ddxz" event={"ID":"74840955-d478-4dfb-a30d-9cff482b4e7e","Type":"ContainerDied","Data":"3c524e5519d0b87a7c6b7277dc0f69b9da909592890261c7e615033235220fc5"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.020114 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c524e5519d0b87a7c6b7277dc0f69b9da909592890261c7e615033235220fc5" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.020164 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8ddxz" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.021822 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8qlsn" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.021834 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8qlsn" event={"ID":"44db129c-1074-4f16-9c71-31a59ddd62a5","Type":"ContainerDied","Data":"8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.022049 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8def859cae8fa75c1bcf1d617172eee933ee1a043615d55a19daf25b69ec99e1" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.024259 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lf6jj" event={"ID":"f5cc4a94-12fd-42a3-b2cb-7d04163ca285","Type":"ContainerDied","Data":"c49a14f4017c7995264e6d466b9aa93064532ec8875fbe41d8fa8d6685a6db38"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.024296 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c49a14f4017c7995264e6d466b9aa93064532ec8875fbe41d8fa8d6685a6db38" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.024394 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lf6jj" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.029166 4952 generic.go:334] "Generic (PLEG): container finished" podID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerID="719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034" exitCode=0 Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.029232 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.029263 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerDied","Data":"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.029346 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b90cc371-f277-4e5c-9e95-5b8233d75503","Type":"ContainerDied","Data":"3188a7d9e90a3aa3411929beeb33e7c08c31a29bb3ddf556d15cc57bb0d96696"} Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.029378 4952 scope.go:117] "RemoveContainer" containerID="51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.037929 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.037952 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90cc371-f277-4e5c-9e95-5b8233d75503-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.087822 4952 scope.go:117] "RemoveContainer" containerID="4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.122690 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73e75a6d_eb3a_4e92_a13c_8ab87d2beb4f.slice\": RecentStats: unable to find data in memory 
cache]" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.193501 4952 scope.go:117] "RemoveContainer" containerID="719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.215181 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.216554 4952 scope.go:117] "RemoveContainer" containerID="a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.228447 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.242922 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.243707 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.243930 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.244048 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca19638a-64bf-4f46-84d3-efd709c1593f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244124 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca19638a-64bf-4f46-84d3-efd709c1593f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.244221 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74840955-d478-4dfb-a30d-9cff482b4e7e" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244324 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="74840955-d478-4dfb-a30d-9cff482b4e7e" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.244408 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-api" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244477 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-api" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244147 4952 scope.go:117] "RemoveContainer" containerID="51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.244576 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244752 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.244861 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-notification-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.244935 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-notification-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245008 4952 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="44db129c-1074-4f16-9c71-31a59ddd62a5" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.245074 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="44db129c-1074-4f16-9c71-31a59ddd62a5" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245146 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-central-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.245215 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-central-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245294 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cc4a94-12fd-42a3-b2cb-7d04163ca285" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.245379 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cc4a94-12fd-42a3-b2cb-7d04163ca285" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245489 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.245610 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245739 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="sg-core" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.245836 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="sg-core" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245949 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="proxy-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.246023 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="proxy-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.245144 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb\": container with ID starting with 51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb not found: ID does not exist" containerID="51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.246341 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb"} err="failed to get container status \"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb\": rpc error: code = NotFound desc = could not find container \"51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb\": container with ID starting with 51d900984d182c81497d901cd7fe4146c0ead69e15ad2229678df2eb2b4ee2eb not found: ID does not exist" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.246390 4952 scope.go:117] "RemoveContainer" containerID="4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8" Nov 22 03:11:55 crc 
kubenswrapper[4952]: E1122 03:11:55.249231 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8\": container with ID starting with 4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8 not found: ID does not exist" containerID="4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.249279 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8"} err="failed to get container status \"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8\": rpc error: code = NotFound desc = could not find container \"4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8\": container with ID starting with 4563323ac5588f3f9f959c00bdc334a9fd0fdb751a15e8b2dadb669cb65144c8 not found: ID does not exist" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.249312 4952 scope.go:117] "RemoveContainer" containerID="719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.249869 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034\": container with ID starting with 719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034 not found: ID does not exist" containerID="719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.249931 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034"} err="failed to get container status \"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034\": rpc error: code = NotFound desc = could not find container \"719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034\": container with ID starting with 719789bbe8a533e735cafc09236a0eb8434a6c4520aa604f6afd472ee1e89034 not found: ID does not exist" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.249964 4952 scope.go:117] "RemoveContainer" containerID="a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4" Nov 22 03:11:55 crc kubenswrapper[4952]: E1122 03:11:55.250633 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4\": container with ID starting with a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4 not found: ID does not exist" containerID="a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.250660 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4"} err="failed to get container status \"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4\": rpc error: code = NotFound desc = could not find container \"a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4\": container with ID starting with a523636e7e2301571fa6df70c565c588832cbdf72da50d7ee998a2bf1a992aa4 not found: ID does not exist" Nov 22 03:11:55 crc kubenswrapper[4952]: 
I1122 03:11:55.252155 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="74840955-d478-4dfb-a30d-9cff482b4e7e" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252230 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252279 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-notification-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252309 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="44db129c-1074-4f16-9c71-31a59ddd62a5" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252323 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-api" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252335 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cc4a94-12fd-42a3-b2cb-7d04163ca285" containerName="mariadb-database-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252351 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="ceilometer-central-agent" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252373 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="proxy-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252387 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca19638a-64bf-4f46-84d3-efd709c1593f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252402 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" containerName="mariadb-account-create" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252417 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3701508-1734-46b5-83f3-9f08f930b294" containerName="neutron-httpd" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.252437 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" containerName="sg-core" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.254638 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.257380 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.257627 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.262393 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.347678 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.347790 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.347825 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.347945 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.348359 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2v2w\" (UniqueName: \"kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.348402 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.348470 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450200 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2v2w\" (UniqueName: \"kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: 
I1122 03:11:55.450269 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450300 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450373 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450407 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450422 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.450442 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.452429 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.452467 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.455027 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.455628 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.460234 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.481891 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.490619 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2v2w\" (UniqueName: \"kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w\") pod \"ceilometer-0\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " pod="openstack/ceilometer-0" Nov 22 03:11:55 crc kubenswrapper[4952]: I1122 03:11:55.594899 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.112855 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:56 crc kubenswrapper[4952]: W1122 03:11:56.113954 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b02d40_e6f4_4bb7_8ab3_c23cc983eac9.slice/crio-f0fc8c476b7443d29e8b9d75e2b0473915adbfc1f6bd858d9ca1aa407ecb2760 WatchSource:0}: Error finding container f0fc8c476b7443d29e8b9d75e2b0473915adbfc1f6bd858d9ca1aa407ecb2760: Status 404 returned error can't find the container with id f0fc8c476b7443d29e8b9d75e2b0473915adbfc1f6bd858d9ca1aa407ecb2760 Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.350071 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.350398 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" containerName="kube-state-metrics" containerID="cri-o://cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b" gracePeriod=30 Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.554743 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90cc371-f277-4e5c-9e95-5b8233d75503" path="/var/lib/kubelet/pods/b90cc371-f277-4e5c-9e95-5b8233d75503/volumes" Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.852758 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.886574 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmg8d\" (UniqueName: \"kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d\") pod \"11ba6f7e-b307-4bd0-9f84-46bcf5721c38\" (UID: \"11ba6f7e-b307-4bd0-9f84-46bcf5721c38\") " Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.897945 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d" (OuterVolumeSpecName: "kube-api-access-pmg8d") pod "11ba6f7e-b307-4bd0-9f84-46bcf5721c38" (UID: "11ba6f7e-b307-4bd0-9f84-46bcf5721c38"). InnerVolumeSpecName "kube-api-access-pmg8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:56 crc kubenswrapper[4952]: I1122 03:11:56.989205 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmg8d\" (UniqueName: \"kubernetes.io/projected/11ba6f7e-b307-4bd0-9f84-46bcf5721c38-kube-api-access-pmg8d\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.055679 4952 generic.go:334] "Generic (PLEG): container finished" podID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" containerID="cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b" exitCode=2 Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.055756 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11ba6f7e-b307-4bd0-9f84-46bcf5721c38","Type":"ContainerDied","Data":"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b"} Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.055779 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.056122 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11ba6f7e-b307-4bd0-9f84-46bcf5721c38","Type":"ContainerDied","Data":"9710774eefc2dcbe8b9659ef22d1fa3dbb63633effe4955e710bfceec129f991"} Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.056180 4952 scope.go:117] "RemoveContainer" containerID="cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.057509 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerStarted","Data":"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9"} Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.057637 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerStarted","Data":"f0fc8c476b7443d29e8b9d75e2b0473915adbfc1f6bd858d9ca1aa407ecb2760"} Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.082726 4952 scope.go:117] "RemoveContainer" containerID="cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b" Nov 22 03:11:57 crc kubenswrapper[4952]: E1122 03:11:57.083959 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b\": container with ID starting with cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b not found: ID does not exist" containerID="cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.083995 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b"} err="failed to get container status \"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b\": rpc error: code = NotFound desc = could not find container \"cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b\": container with ID starting with cdaa3dbb4cca44eee3f134748c92d1936509b1b70485ce44397f68ee51eeb48b not found: ID does not exist" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.104605 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] 
Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.114761 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.139956 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:11:57 crc kubenswrapper[4952]: E1122 03:11:57.140388 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" containerName="kube-state-metrics" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.140408 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" containerName="kube-state-metrics" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.140626 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" containerName="kube-state-metrics" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.141340 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.154490 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.156195 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.165751 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.192324 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.192390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvt85\" (UniqueName: \"kubernetes.io/projected/5c8e5784-03cd-4a8a-9859-afe728764282-kube-api-access-mvt85\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.192523 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.192644 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.294506 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.294588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvt85\" (UniqueName: \"kubernetes.io/projected/5c8e5784-03cd-4a8a-9859-afe728764282-kube-api-access-mvt85\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.294664 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.294694 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.300035 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.303986 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.307160 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c8e5784-03cd-4a8a-9859-afe728764282-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.317029 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvt85\" (UniqueName: \"kubernetes.io/projected/5c8e5784-03cd-4a8a-9859-afe728764282-kube-api-access-mvt85\") pod \"kube-state-metrics-0\" (UID: \"5c8e5784-03cd-4a8a-9859-afe728764282\") " pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.501611 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:11:57 crc kubenswrapper[4952]: I1122 03:11:57.672388 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.002353 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.067393 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c8e5784-03cd-4a8a-9859-afe728764282","Type":"ContainerStarted","Data":"b370cce701a2ad31cc9db8791b31f2a00f9dd592651c51c09345932bb1939c35"} Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.069257 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerStarted","Data":"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4"} Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.342119 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.342699 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:11:58 crc kubenswrapper[4952]: I1122 03:11:58.544174 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ba6f7e-b307-4bd0-9f84-46bcf5721c38" path="/var/lib/kubelet/pods/11ba6f7e-b307-4bd0-9f84-46bcf5721c38/volumes" Nov 22 03:11:59 crc kubenswrapper[4952]: I1122 03:11:59.087878 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerStarted","Data":"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8"} Nov 22 03:11:59 crc kubenswrapper[4952]: I1122 03:11:59.091293 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c8e5784-03cd-4a8a-9859-afe728764282","Type":"ContainerStarted","Data":"8474b3cddf9d8a7d783f317260cf5bdcf81efd49caaab900fd054af04e0c6092"} Nov 22 03:11:59 crc kubenswrapper[4952]: I1122 03:11:59.091493 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 03:11:59 crc kubenswrapper[4952]: I1122 03:11:59.118263 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7692959 podStartE2EDuration="2.118230373s" podCreationTimestamp="2025-11-22 03:11:57 +0000 UTC" firstStartedPulling="2025-11-22 03:11:58.018110068 +0000 UTC m=+1082.324127341" lastFinishedPulling="2025-11-22 03:11:58.367044541 +0000 UTC m=+1082.673061814" observedRunningTime="2025-11-22 03:11:59.108653285 +0000 UTC m=+1083.414670558" watchObservedRunningTime="2025-11-22 03:11:59.118230373 +0000 UTC m=+1083.424247646" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.103848 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerStarted","Data":"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e"} Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.104443 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.104070 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="proxy-httpd" containerID="cri-o://f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e" gracePeriod=30 Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.104040 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-central-agent" containerID="cri-o://700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9" gracePeriod=30 Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.104196 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-notification-agent" containerID="cri-o://4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4" gracePeriod=30 Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.104174 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="sg-core" containerID="cri-o://625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8" gracePeriod=30 Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.138807 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.695468976 podStartE2EDuration="5.138779114s" podCreationTimestamp="2025-11-22 03:11:55 +0000 UTC" firstStartedPulling="2025-11-22 03:11:56.116963817 +0000 UTC m=+1080.422981100" lastFinishedPulling="2025-11-22 03:11:59.560273965 +0000 UTC m=+1083.866291238" observedRunningTime="2025-11-22 03:12:00.132436264 +0000 UTC m=+1084.438453527" watchObservedRunningTime="2025-11-22 03:12:00.138779114 +0000 UTC m=+1084.444796387" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.730140 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rxtxb"] Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.731792 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.746359 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8hwjw" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.747067 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.759305 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rxtxb"] Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.776155 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.782191 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.782390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.782466 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6zp\" (UniqueName: \"kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.782525 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.888989 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.889412 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.889520 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: 
\"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.889592 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6zp\" (UniqueName: \"kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.896835 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.896889 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.899111 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:00 crc kubenswrapper[4952]: I1122 03:12:00.907391 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6zp\" (UniqueName: \"kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp\") pod \"nova-cell0-conductor-db-sync-rxtxb\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.104560 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119696 4952 generic.go:334] "Generic (PLEG): container finished" podID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerID="f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e" exitCode=0 Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119744 4952 generic.go:334] "Generic (PLEG): container finished" podID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerID="625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8" exitCode=2 Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119754 4952 generic.go:334] "Generic (PLEG): container finished" podID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerID="4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4" exitCode=0 Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119777 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerDied","Data":"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e"} Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119828 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerDied","Data":"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8"} Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.119838 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerDied","Data":"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4"} Nov 22 03:12:01 crc kubenswrapper[4952]: I1122 03:12:01.581622 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rxtxb"] Nov 22 03:12:02 crc kubenswrapper[4952]: I1122 03:12:02.133139 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" event={"ID":"dff8e549-6708-4b20-acc9-411cf736985a","Type":"ContainerStarted","Data":"18464b662f147a6d872e4ea23bf73619870959dc6af99846a87d988b84266c53"} Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.143672 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.163603 4952 generic.go:334] "Generic (PLEG): container finished" podID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerID="700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9" exitCode=0 Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.163668 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerDied","Data":"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9"} Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.163706 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9","Type":"ContainerDied","Data":"f0fc8c476b7443d29e8b9d75e2b0473915adbfc1f6bd858d9ca1aa407ecb2760"} Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.163730 4952 scope.go:117] "RemoveContainer" containerID="f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.163924 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.196103 4952 scope.go:117] "RemoveContainer" containerID="625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.226786 4952 scope.go:117] "RemoveContainer" containerID="4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258310 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258421 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258529 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258601 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2v2w\" (UniqueName: \"kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258728 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258759 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.258794 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd\") pod \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\" (UID: \"78b02d40-e6f4-4bb7-8ab3-c23cc983eac9\") " Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.259496 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.260229 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.264092 4952 scope.go:117] "RemoveContainer" containerID="700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.266893 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w" (OuterVolumeSpecName: "kube-api-access-q2v2w") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "kube-api-access-q2v2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.270653 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts" (OuterVolumeSpecName: "scripts") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.295772 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.352938 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361108 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361145 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361158 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361167 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361176 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.361185 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2v2w\" (UniqueName: \"kubernetes.io/projected/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-kube-api-access-q2v2w\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.375676 4952 scope.go:117] "RemoveContainer" containerID="f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.376327 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e\": container with ID starting with f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e not found: ID does not exist" containerID="f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.376392 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e"} err="failed to get container status \"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e\": rpc error: code = NotFound desc = could not find container \"f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e\": container with ID starting with f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e not found: ID does not exist" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.376424 4952 scope.go:117] "RemoveContainer" containerID="625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.377041 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8\": container with ID starting with 625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8 not found: ID does not exist" containerID="625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8" Nov 22 03:12:04 crc 
kubenswrapper[4952]: I1122 03:12:04.377096 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8"} err="failed to get container status \"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8\": rpc error: code = NotFound desc = could not find container \"625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8\": container with ID starting with 625e5ab4e4a4591d7682d5f1ceb37ca76bed2e6ae0d90e07d5f48a0b9357dac8 not found: ID does not exist" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.377120 4952 scope.go:117] "RemoveContainer" containerID="4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.377481 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4\": container with ID starting with 4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4 not found: ID does not exist" containerID="4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.377516 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4"} err="failed to get container status \"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4\": rpc error: code = NotFound desc = could not find container \"4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4\": container with ID starting with 4c5bfb80ca226383c1ca6da6aa4169a3118ded2f48abcaba2da0469f20d8daa4 not found: ID does not exist" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.377613 4952 scope.go:117] "RemoveContainer" containerID="700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.377929 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9\": container with ID starting with 700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9 not found: ID does not exist" containerID="700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.377969 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9"} err="failed to get container status \"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9\": rpc error: code = NotFound desc = could not find container \"700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9\": container with ID starting with 700e4f102cbc32039026463601f0600ba3d41e97a17aaebb207d2cc602540fd9 not found: ID does not exist" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.381722 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data" (OuterVolumeSpecName: "config-data") pod "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" (UID: "78b02d40-e6f4-4bb7-8ab3-c23cc983eac9"). InnerVolumeSpecName "config-data". 
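
The NotFound errors above are the benign tail of RemoveContainer: the kubelet re-queries each container it just deleted, and the runtime correctly reports that the ID no longer exists. CRI clients typically special-case that status rather than treating it as a failure; a sketch assuming the stock grpc-go status/codes helpers, not kubelet's actual cleanup path:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // containerStatus stands in for a CRI ContainerStatus call (hypothetical stub
    // that always answers the way the runtime does above for a removed container).
    func containerStatus(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    func main() {
        err := containerStatus("f0b1e5342af38165733eb20d300950f6762776561aac2244fae4b78b91bb6b8e")
        if status.Code(err) == codes.NotFound {
            fmt.Println("already gone, nothing to do") // treat as success
        } else if err != nil {
            panic(err)
        }
    }
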
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.462697 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.509091 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.518238 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.554603 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" path="/var/lib/kubelet/pods/78b02d40-e6f4-4bb7-8ab3-c23cc983eac9/volumes" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.599872 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.601345 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="proxy-httpd" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.601364 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="proxy-httpd" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.601378 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="sg-core" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.601391 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="sg-core" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.601420 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-notification-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.601428 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-notification-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: E1122 03:12:04.601468 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-central-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.601476 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-central-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.601990 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-central-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.602011 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="sg-core" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.602041 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="proxy-httpd" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.602053 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b02d40-e6f4-4bb7-8ab3-c23cc983eac9" containerName="ceilometer-notification-agent" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.610773 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.616922 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.618732 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.619813 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.621260 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.687381 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.687467 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.687732 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.687919 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.687969 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.688032 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.688072 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjpdt\" (UniqueName: \"kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.688107 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.791740 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.792336 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.792752 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.792909 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.792978 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.793005 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.793032 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.793059 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjpdt\" (UniqueName: \"kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.793087 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.793345 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.795992 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.802819 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.803303 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.804640 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.827433 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.836364 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjpdt\" (UniqueName: \"kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt\") pod \"ceilometer-0\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " pod="openstack/ceilometer-0" Nov 22 03:12:04 crc kubenswrapper[4952]: I1122 03:12:04.982787 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:05 crc kubenswrapper[4952]: I1122 03:12:05.505393 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:05 crc kubenswrapper[4952]: I1122 03:12:05.892175 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:07 crc kubenswrapper[4952]: I1122 03:12:07.519110 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 03:12:10 crc kubenswrapper[4952]: I1122 03:12:10.262344 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerStarted","Data":"278f38adee9f3fcaec840756189a58c7ed45da984d956287e73c2433569177fa"} Nov 22 03:12:11 crc kubenswrapper[4952]: I1122 03:12:11.280611 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerStarted","Data":"2267fd930845ffd8d1a14b5f061e1287e841e0bc24750f7570b3b8ed7410368f"} Nov 22 03:12:11 crc kubenswrapper[4952]: I1122 03:12:11.283883 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" event={"ID":"dff8e549-6708-4b20-acc9-411cf736985a","Type":"ContainerStarted","Data":"b0bc29ddcfabb4faf17c7ef461f2adc42889ee8b2d974dc2d1ea21a1a9a908e4"} Nov 22 03:12:11 crc kubenswrapper[4952]: I1122 03:12:11.307911 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" podStartSLOduration=2.291092454 podStartE2EDuration="11.307881234s" podCreationTimestamp="2025-11-22 03:12:00 +0000 UTC" firstStartedPulling="2025-11-22 03:12:01.5917921 +0000 UTC m=+1085.897809373" lastFinishedPulling="2025-11-22 03:12:10.60858088 +0000 UTC m=+1094.914598153" observedRunningTime="2025-11-22 03:12:11.307051782 +0000 UTC m=+1095.613069055" watchObservedRunningTime="2025-11-22 03:12:11.307881234 +0000 UTC m=+1095.613898507" Nov 22 03:12:12 crc kubenswrapper[4952]: I1122 03:12:12.303056 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerStarted","Data":"c7f50c7374868506e18b188906a8af5cdaa9fec349ec6bd867a26c80369bcb76"} Nov 22 03:12:13 crc kubenswrapper[4952]: I1122 03:12:13.314341 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerStarted","Data":"5d065045115d8c98abe0ff39c52373d088b79adbca15c5dedd4fb02191dd25cd"} Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.351981 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerStarted","Data":"6cc40601b5d19bf9ec48c035f909fb3b9807595738ade2d1a024d8e2dbe998a9"} Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.352952 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.352180 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="proxy-httpd" containerID="cri-o://6cc40601b5d19bf9ec48c035f909fb3b9807595738ade2d1a024d8e2dbe998a9" gracePeriod=30 Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.352139 4952 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-central-agent" containerID="cri-o://2267fd930845ffd8d1a14b5f061e1287e841e0bc24750f7570b3b8ed7410368f" gracePeriod=30 Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.352238 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="sg-core" containerID="cri-o://5d065045115d8c98abe0ff39c52373d088b79adbca15c5dedd4fb02191dd25cd" gracePeriod=30 Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.352297 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-notification-agent" containerID="cri-o://c7f50c7374868506e18b188906a8af5cdaa9fec349ec6bd867a26c80369bcb76" gracePeriod=30 Nov 22 03:12:16 crc kubenswrapper[4952]: I1122 03:12:16.406533 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.752362242 podStartE2EDuration="12.40650388s" podCreationTimestamp="2025-11-22 03:12:04 +0000 UTC" firstStartedPulling="2025-11-22 03:12:10.539427326 +0000 UTC m=+1094.845444599" lastFinishedPulling="2025-11-22 03:12:15.193568954 +0000 UTC m=+1099.499586237" observedRunningTime="2025-11-22 03:12:16.388450943 +0000 UTC m=+1100.694468226" watchObservedRunningTime="2025-11-22 03:12:16.40650388 +0000 UTC m=+1100.712521153" Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363795 4952 generic.go:334] "Generic (PLEG): container finished" podID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerID="6cc40601b5d19bf9ec48c035f909fb3b9807595738ade2d1a024d8e2dbe998a9" exitCode=0 Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363836 4952 generic.go:334] "Generic (PLEG): container finished" podID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerID="5d065045115d8c98abe0ff39c52373d088b79adbca15c5dedd4fb02191dd25cd" exitCode=2 Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363846 4952 generic.go:334] "Generic (PLEG): container finished" podID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerID="c7f50c7374868506e18b188906a8af5cdaa9fec349ec6bd867a26c80369bcb76" exitCode=0 Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363899 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerDied","Data":"6cc40601b5d19bf9ec48c035f909fb3b9807595738ade2d1a024d8e2dbe998a9"} Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363978 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerDied","Data":"5d065045115d8c98abe0ff39c52373d088b79adbca15c5dedd4fb02191dd25cd"} Nov 22 03:12:17 crc kubenswrapper[4952]: I1122 03:12:17.363993 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerDied","Data":"c7f50c7374868506e18b188906a8af5cdaa9fec349ec6bd867a26c80369bcb76"} Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.377349 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.377891 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerDied","Data":"2267fd930845ffd8d1a14b5f061e1287e841e0bc24750f7570b3b8ed7410368f"} Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.377838 4952 generic.go:334] "Generic (PLEG): container finished" podID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerID="2267fd930845ffd8d1a14b5f061e1287e841e0bc24750f7570b3b8ed7410368f" exitCode=0 Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.377968 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33ee5d76-294e-418f-814d-d47f30cf8b16","Type":"ContainerDied","Data":"278f38adee9f3fcaec840756189a58c7ed45da984d956287e73c2433569177fa"} Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.377981 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278f38adee9f3fcaec840756189a58c7ed45da984d956287e73c2433569177fa" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.516740 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjpdt\" (UniqueName: \"kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.516903 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.516935 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517007 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517136 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517159 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517185 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: 
\"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517204 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data\") pod \"33ee5d76-294e-418f-814d-d47f30cf8b16\" (UID: \"33ee5d76-294e-418f-814d-d47f30cf8b16\") " Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517619 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.517759 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.551847 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts" (OuterVolumeSpecName: "scripts") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.583997 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt" (OuterVolumeSpecName: "kube-api-access-wjpdt") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "kube-api-access-wjpdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.619413 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.619463 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjpdt\" (UniqueName: \"kubernetes.io/projected/33ee5d76-294e-418f-814d-d47f30cf8b16-kube-api-access-wjpdt\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.619476 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.619485 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33ee5d76-294e-418f-814d-d47f30cf8b16-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.622840 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.670566 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.674011 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.708065 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data" (OuterVolumeSpecName: "config-data") pod "33ee5d76-294e-418f-814d-d47f30cf8b16" (UID: "33ee5d76-294e-418f-814d-d47f30cf8b16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.721675 4952 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.721720 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.721736 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:18 crc kubenswrapper[4952]: I1122 03:12:18.721750 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ee5d76-294e-418f-814d-d47f30cf8b16-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.390573 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.435851 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.446631 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.514607 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:19 crc kubenswrapper[4952]: E1122 03:12:19.515320 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-notification-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.515341 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-notification-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: E1122 03:12:19.515378 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-central-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.515388 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-central-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: E1122 03:12:19.515413 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="proxy-httpd" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.515421 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="proxy-httpd" Nov 22 03:12:19 crc kubenswrapper[4952]: E1122 03:12:19.515438 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="sg-core" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.515445 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="sg-core" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.515684 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-notification-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.516048 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="ceilometer-central-agent" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.516064 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="sg-core" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.516076 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" containerName="proxy-httpd" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.525112 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.528971 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.529047 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.530160 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.534940 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.642717 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643217 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643301 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643334 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643370 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643409 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.643427 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.745791 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.745871 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.745921 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.745957 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.745984 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.746010 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.746048 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.746093 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.747873 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.747989 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.751607 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.751902 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.751946 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.753003 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.754216 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.769520 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl\") pod \"ceilometer-0\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " pod="openstack/ceilometer-0" Nov 22 03:12:19 crc kubenswrapper[4952]: I1122 03:12:19.849758 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:20 crc kubenswrapper[4952]: I1122 03:12:20.132146 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:20 crc kubenswrapper[4952]: I1122 03:12:20.402941 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerStarted","Data":"145ef0de9c48b62d154eea2d105601d826c14dc15c480352f135222ba4c2bd20"} Nov 22 03:12:20 crc kubenswrapper[4952]: I1122 03:12:20.546004 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ee5d76-294e-418f-814d-d47f30cf8b16" path="/var/lib/kubelet/pods/33ee5d76-294e-418f-814d-d47f30cf8b16/volumes" Nov 22 03:12:22 crc kubenswrapper[4952]: I1122 03:12:22.425010 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerStarted","Data":"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580"} Nov 22 03:12:23 crc kubenswrapper[4952]: I1122 03:12:23.440062 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerStarted","Data":"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365"} Nov 22 03:12:23 crc kubenswrapper[4952]: I1122 03:12:23.441044 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerStarted","Data":"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de"} Nov 22 03:12:25 crc kubenswrapper[4952]: I1122 03:12:25.462724 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerStarted","Data":"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5"} Nov 22 03:12:25 crc kubenswrapper[4952]: I1122 03:12:25.463311 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:12:25 crc kubenswrapper[4952]: I1122 03:12:25.490072 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.287777219 podStartE2EDuration="6.490049304s" podCreationTimestamp="2025-11-22 03:12:19 +0000 UTC" firstStartedPulling="2025-11-22 03:12:20.144991126 +0000 UTC m=+1104.451008409" lastFinishedPulling="2025-11-22 03:12:24.347263211 +0000 UTC m=+1108.653280494" observedRunningTime="2025-11-22 03:12:25.484021074 +0000 UTC m=+1109.790038377" watchObservedRunningTime="2025-11-22 03:12:25.490049304 +0000 UTC m=+1109.796066577" Nov 22 03:12:26 crc kubenswrapper[4952]: I1122 03:12:26.477589 4952 generic.go:334] "Generic (PLEG): container finished" podID="dff8e549-6708-4b20-acc9-411cf736985a" containerID="b0bc29ddcfabb4faf17c7ef461f2adc42889ee8b2d974dc2d1ea21a1a9a908e4" exitCode=0 Nov 22 03:12:26 crc kubenswrapper[4952]: I1122 03:12:26.477816 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" event={"ID":"dff8e549-6708-4b20-acc9-411cf736985a","Type":"ContainerDied","Data":"b0bc29ddcfabb4faf17c7ef461f2adc42889ee8b2d974dc2d1ea21a1a9a908e4"} Nov 22 03:12:27 crc kubenswrapper[4952]: I1122 03:12:27.881487 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.052121 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts\") pod \"dff8e549-6708-4b20-acc9-411cf736985a\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.052481 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data\") pod \"dff8e549-6708-4b20-acc9-411cf736985a\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.052521 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle\") pod \"dff8e549-6708-4b20-acc9-411cf736985a\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.052611 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6zp\" (UniqueName: \"kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp\") pod \"dff8e549-6708-4b20-acc9-411cf736985a\" (UID: \"dff8e549-6708-4b20-acc9-411cf736985a\") " Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.061532 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts" (OuterVolumeSpecName: "scripts") pod "dff8e549-6708-4b20-acc9-411cf736985a" (UID: "dff8e549-6708-4b20-acc9-411cf736985a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.062266 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp" (OuterVolumeSpecName: "kube-api-access-pw6zp") pod "dff8e549-6708-4b20-acc9-411cf736985a" (UID: "dff8e549-6708-4b20-acc9-411cf736985a"). InnerVolumeSpecName "kube-api-access-pw6zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.091136 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data" (OuterVolumeSpecName: "config-data") pod "dff8e549-6708-4b20-acc9-411cf736985a" (UID: "dff8e549-6708-4b20-acc9-411cf736985a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.093818 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dff8e549-6708-4b20-acc9-411cf736985a" (UID: "dff8e549-6708-4b20-acc9-411cf736985a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.155246 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.155289 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6zp\" (UniqueName: \"kubernetes.io/projected/dff8e549-6708-4b20-acc9-411cf736985a-kube-api-access-pw6zp\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.155301 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.155311 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff8e549-6708-4b20-acc9-411cf736985a-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.341825 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.341892 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.512764 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" event={"ID":"dff8e549-6708-4b20-acc9-411cf736985a","Type":"ContainerDied","Data":"18464b662f147a6d872e4ea23bf73619870959dc6af99846a87d988b84266c53"} Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.512826 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18464b662f147a6d872e4ea23bf73619870959dc6af99846a87d988b84266c53" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.512874 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rxtxb" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.691392 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:12:28 crc kubenswrapper[4952]: E1122 03:12:28.691813 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff8e549-6708-4b20-acc9-411cf736985a" containerName="nova-cell0-conductor-db-sync" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.691828 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff8e549-6708-4b20-acc9-411cf736985a" containerName="nova-cell0-conductor-db-sync" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.692046 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff8e549-6708-4b20-acc9-411cf736985a" containerName="nova-cell0-conductor-db-sync" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.692711 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.695356 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8hwjw" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.698647 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.728697 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.771491 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.771596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtfj\" (UniqueName: \"kubernetes.io/projected/1ed8df98-1a16-4c89-b60b-c3589ec701be-kube-api-access-7wtfj\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.771626 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.873413 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.873502 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtfj\" (UniqueName: \"kubernetes.io/projected/1ed8df98-1a16-4c89-b60b-c3589ec701be-kube-api-access-7wtfj\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.873537 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.879092 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.879189 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed8df98-1a16-4c89-b60b-c3589ec701be-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:28 crc kubenswrapper[4952]: I1122 03:12:28.895127 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtfj\" (UniqueName: \"kubernetes.io/projected/1ed8df98-1a16-4c89-b60b-c3589ec701be-kube-api-access-7wtfj\") pod \"nova-cell0-conductor-0\" (UID: \"1ed8df98-1a16-4c89-b60b-c3589ec701be\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:29 crc kubenswrapper[4952]: I1122 03:12:29.009613 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:29 crc kubenswrapper[4952]: I1122 03:12:29.544878 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:12:30 crc kubenswrapper[4952]: I1122 03:12:30.550449 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1ed8df98-1a16-4c89-b60b-c3589ec701be","Type":"ContainerStarted","Data":"c00ba9db08b98c005cac2247d24a3ad5be43ee146949875a099bf1170592b8f8"} Nov 22 03:12:30 crc kubenswrapper[4952]: I1122 03:12:30.550890 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:30 crc kubenswrapper[4952]: I1122 03:12:30.550905 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1ed8df98-1a16-4c89-b60b-c3589ec701be","Type":"ContainerStarted","Data":"f09ad537bd932068afe336cac1444759d123153970188ecc3f90e1a20fe6326b"} Nov 22 03:12:30 crc kubenswrapper[4952]: I1122 03:12:30.579676 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.579646021 podStartE2EDuration="2.579646021s" podCreationTimestamp="2025-11-22 03:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:30.57432631 +0000 UTC m=+1114.880343603" watchObservedRunningTime="2025-11-22 03:12:30.579646021 +0000 UTC m=+1114.885663304" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.059466 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.582725 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ng2bq"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.584946 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.591657 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.592368 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.603421 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ng2bq"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.711938 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2ds\" (UniqueName: \"kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.712035 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.712056 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.712094 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.799533 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.801097 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.805181 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.815095 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2ds\" (UniqueName: \"kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.815159 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.815184 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.815214 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.820972 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.834442 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.836997 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.852121 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.902249 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2ds\" (UniqueName: \"kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds\") pod \"nova-cell0-cell-mapping-ng2bq\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.913982 4952 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.918596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2ckl\" (UniqueName: \"kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.919128 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.919358 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.934874 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.936264 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.939215 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.943734 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.945918 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.957650 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:12:34 crc kubenswrapper[4952]: I1122 03:12:34.966700 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.015848 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.023991 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2ckl\" (UniqueName: \"kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.024883 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm8l8\" (UniqueName: \"kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025027 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025169 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025265 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025386 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025508 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lln7w\" (UniqueName: \"kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025638 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 
crc kubenswrapper[4952]: I1122 03:12:35.025743 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.025870 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.031989 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.032003 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.058600 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2ckl\" (UniqueName: \"kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.132296 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.145475 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.145661 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.145871 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lln7w\" (UniqueName: \"kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.145988 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.146042 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.146145 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.146255 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8l8\" (UniqueName: \"kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.148350 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.156616 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.158388 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.160820 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.161883 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.200201 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.203826 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.209289 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.211192 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.211969 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lln7w\" (UniqueName: \"kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w\") pod \"nova-api-0\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.220289 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8l8\" (UniqueName: \"kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8\") pod \"nova-scheduler-0\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.220403 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.230739 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.230895 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.250315 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.256959 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.257177 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wdk\" (UniqueName: \"kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.257839 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367265 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v27q\" (UniqueName: \"kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367319 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367353 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wdk\" (UniqueName: \"kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367487 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367689 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367767 
4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367821 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.367890 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.368042 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.368042 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.372641 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.386415 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wdk\" (UniqueName: \"kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.391105 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data\") pod \"nova-metadata-0\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.420840 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.435341 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.470479 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.470538 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.470583 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.470671 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v27q\" (UniqueName: \"kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.470706 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.471790 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.471807 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.472166 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.472568 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.494628 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v27q\" (UniqueName: \"kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q\") pod \"dnsmasq-dns-566b5b7845-pqrg5\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.551583 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.596705 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.684028 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ng2bq"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.731472 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k62bf"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.733032 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.739658 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.740099 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.775341 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k62bf"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.781166 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.781230 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.781351 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.781408 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhgf\" (UniqueName: \"kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.885162 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.885253 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhgf\" (UniqueName: \"kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.885299 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.885323 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.885440 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.927603 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.928181 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.934646 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:35 crc kubenswrapper[4952]: I1122 03:12:35.940476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhgf\" (UniqueName: \"kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf\") pod \"nova-cell1-conductor-db-sync-k62bf\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.078811 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.086042 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:36 crc kubenswrapper[4952]: W1122 03:12:36.272058 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4288052_cfd5_44a1_b156_06a7ee436d82.slice/crio-575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0 WatchSource:0}: Error finding container 575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0: Status 404 returned error can't find the container with id 575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0 Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.276186 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:36 crc kubenswrapper[4952]: W1122 03:12:36.509798 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e3d611_ff14_4db8_967c_45ccac308355.slice/crio-e635b6c8c1f7cb1f1657936eba67bf0a6396e6b4c11950d1ffa26a0f8a54d3b1 WatchSource:0}: Error finding container e635b6c8c1f7cb1f1657936eba67bf0a6396e6b4c11950d1ffa26a0f8a54d3b1: Status 404 returned error can't find the container with id e635b6c8c1f7cb1f1657936eba67bf0a6396e6b4c11950d1ffa26a0f8a54d3b1 Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.510181 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.656359 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5513e5e6-3481-4787-9c6e-ead3418a2137","Type":"ContainerStarted","Data":"ac230a931bb4f99b7deb864e202d02ca42a45c1a628d5d5544fdf5ac923b55ba"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.676486 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ng2bq" event={"ID":"8e7d5803-2a94-4be7-874e-59415a346d19","Type":"ContainerStarted","Data":"9c1b04376b7b9ff7f3b32665c342ea62e578e26d4d3df5d6910626f8d4d20806"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.677247 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ng2bq" event={"ID":"8e7d5803-2a94-4be7-874e-59415a346d19","Type":"ContainerStarted","Data":"7addce2bc7dcd885f7641549060bb1f773275dd1d66e320cf2c77a573c2a636f"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.678711 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerStarted","Data":"e635b6c8c1f7cb1f1657936eba67bf0a6396e6b4c11950d1ffa26a0f8a54d3b1"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.683838 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerStarted","Data":"575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.685657 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6f78980-48c3-49d7-8127-6d06c53df6f8","Type":"ContainerStarted","Data":"2867ba8ca66a110d2d49fa497c5ff5320078596fa78a357ffb246f66ebeba14b"} Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.696435 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.722960 4952 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ng2bq" podStartSLOduration=2.722931945 podStartE2EDuration="2.722931945s" podCreationTimestamp="2025-11-22 03:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:36.694854516 +0000 UTC m=+1121.000871799" watchObservedRunningTime="2025-11-22 03:12:36.722931945 +0000 UTC m=+1121.028949228" Nov 22 03:12:36 crc kubenswrapper[4952]: I1122 03:12:36.838392 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k62bf"] Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.698190 4952 generic.go:334] "Generic (PLEG): container finished" podID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerID="22195f74a71bb4825a9ce31871f9a3ae45278d1ac01a28f2cfb8709d0240cbbc" exitCode=0 Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.698265 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" event={"ID":"3c23dea9-e99e-4527-902f-dc7280730cd3","Type":"ContainerDied","Data":"22195f74a71bb4825a9ce31871f9a3ae45278d1ac01a28f2cfb8709d0240cbbc"} Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.698643 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" event={"ID":"3c23dea9-e99e-4527-902f-dc7280730cd3","Type":"ContainerStarted","Data":"8fceefad12c3f40f24a9d016d7ae82029f1855b6f0fcb2e08894c3ee48ec74fe"} Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.705698 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k62bf" event={"ID":"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279","Type":"ContainerStarted","Data":"ae364a3668a54a6d4f786a4c0d76755aadde0432358b62b044d5b0f767ccc98b"} Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.705744 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k62bf" event={"ID":"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279","Type":"ContainerStarted","Data":"07212f47953784b4676bbbc43a531b5b41424adde36bda703b29c3e10641dbe0"} Nov 22 03:12:37 crc kubenswrapper[4952]: I1122 03:12:37.748972 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-k62bf" podStartSLOduration=2.748953284 podStartE2EDuration="2.748953284s" podCreationTimestamp="2025-11-22 03:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:37.74466445 +0000 UTC m=+1122.050681723" watchObservedRunningTime="2025-11-22 03:12:37.748953284 +0000 UTC m=+1122.054970557" Nov 22 03:12:39 crc kubenswrapper[4952]: I1122 03:12:39.089683 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:39 crc kubenswrapper[4952]: I1122 03:12:39.100856 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.801668 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerStarted","Data":"375056e0277bb11196aae00a2a8b965456a69a78f4d67fe129dd5ebcb7ef9e72"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.803825 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerStarted","Data":"842c96526b437073cef231b947fb269736331f7dee1711886023affda9c4b430"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.804431 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6f78980-48c3-49d7-8127-6d06c53df6f8","Type":"ContainerStarted","Data":"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.804626 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f6f78980-48c3-49d7-8127-6d06c53df6f8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c" gracePeriod=30 Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.808383 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" event={"ID":"3c23dea9-e99e-4527-902f-dc7280730cd3","Type":"ContainerStarted","Data":"bc789f8d4149663f25d6cca074074343379003da58f29ba1e734769ce89e73b5"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.808468 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.812242 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5513e5e6-3481-4787-9c6e-ead3418a2137","Type":"ContainerStarted","Data":"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.816061 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerStarted","Data":"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.816127 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerStarted","Data":"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"} Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.816237 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-log" containerID="cri-o://58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97" gracePeriod=30 Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.816278 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-metadata" containerID="cri-o://0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7" gracePeriod=30 Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.848256 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.77767218 podStartE2EDuration="6.848202687s" podCreationTimestamp="2025-11-22 03:12:34 +0000 UTC" firstStartedPulling="2025-11-22 03:12:36.274800275 +0000 UTC m=+1120.580817548" lastFinishedPulling="2025-11-22 03:12:39.345330752 +0000 UTC m=+1123.651348055" observedRunningTime="2025-11-22 03:12:40.826670203 +0000 UTC m=+1125.132687486" watchObservedRunningTime="2025-11-22 03:12:40.848202687 +0000 UTC m=+1125.154219960" 
Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.858691 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" podStartSLOduration=5.858666706 podStartE2EDuration="5.858666706s" podCreationTimestamp="2025-11-22 03:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:40.856798967 +0000 UTC m=+1125.162816240" watchObservedRunningTime="2025-11-22 03:12:40.858666706 +0000 UTC m=+1125.164683979" Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.881031 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.684127016 podStartE2EDuration="6.881003642s" podCreationTimestamp="2025-11-22 03:12:34 +0000 UTC" firstStartedPulling="2025-11-22 03:12:36.157055065 +0000 UTC m=+1120.463072338" lastFinishedPulling="2025-11-22 03:12:39.353931671 +0000 UTC m=+1123.659948964" observedRunningTime="2025-11-22 03:12:40.874633892 +0000 UTC m=+1125.180651175" watchObservedRunningTime="2025-11-22 03:12:40.881003642 +0000 UTC m=+1125.187020925" Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.897162 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.062460623 podStartE2EDuration="5.897135452s" podCreationTimestamp="2025-11-22 03:12:35 +0000 UTC" firstStartedPulling="2025-11-22 03:12:36.517485306 +0000 UTC m=+1120.823502579" lastFinishedPulling="2025-11-22 03:12:39.352160135 +0000 UTC m=+1123.658177408" observedRunningTime="2025-11-22 03:12:40.893658989 +0000 UTC m=+1125.199676272" watchObservedRunningTime="2025-11-22 03:12:40.897135452 +0000 UTC m=+1125.203152725" Nov 22 03:12:40 crc kubenswrapper[4952]: I1122 03:12:40.918652 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.62878601 podStartE2EDuration="6.918628035s" podCreationTimestamp="2025-11-22 03:12:34 +0000 UTC" firstStartedPulling="2025-11-22 03:12:36.055449846 +0000 UTC m=+1120.361467119" lastFinishedPulling="2025-11-22 03:12:39.345291871 +0000 UTC m=+1123.651309144" observedRunningTime="2025-11-22 03:12:40.909908552 +0000 UTC m=+1125.215925825" watchObservedRunningTime="2025-11-22 03:12:40.918628035 +0000 UTC m=+1125.224645308" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.403904 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.522175 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9wdk\" (UniqueName: \"kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk\") pod \"30e3d611-ff14-4db8-967c-45ccac308355\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.522736 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs\") pod \"30e3d611-ff14-4db8-967c-45ccac308355\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.522850 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle\") pod \"30e3d611-ff14-4db8-967c-45ccac308355\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.522919 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data\") pod \"30e3d611-ff14-4db8-967c-45ccac308355\" (UID: \"30e3d611-ff14-4db8-967c-45ccac308355\") " Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.523181 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs" (OuterVolumeSpecName: "logs") pod "30e3d611-ff14-4db8-967c-45ccac308355" (UID: "30e3d611-ff14-4db8-967c-45ccac308355"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.523855 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e3d611-ff14-4db8-967c-45ccac308355-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.529913 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk" (OuterVolumeSpecName: "kube-api-access-v9wdk") pod "30e3d611-ff14-4db8-967c-45ccac308355" (UID: "30e3d611-ff14-4db8-967c-45ccac308355"). InnerVolumeSpecName "kube-api-access-v9wdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.573367 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e3d611-ff14-4db8-967c-45ccac308355" (UID: "30e3d611-ff14-4db8-967c-45ccac308355"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.576473 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data" (OuterVolumeSpecName: "config-data") pod "30e3d611-ff14-4db8-967c-45ccac308355" (UID: "30e3d611-ff14-4db8-967c-45ccac308355"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.626004 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.626056 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e3d611-ff14-4db8-967c-45ccac308355-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.626070 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9wdk\" (UniqueName: \"kubernetes.io/projected/30e3d611-ff14-4db8-967c-45ccac308355-kube-api-access-v9wdk\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.838263 4952 generic.go:334] "Generic (PLEG): container finished" podID="30e3d611-ff14-4db8-967c-45ccac308355" containerID="0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7" exitCode=0 Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.838341 4952 generic.go:334] "Generic (PLEG): container finished" podID="30e3d611-ff14-4db8-967c-45ccac308355" containerID="58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97" exitCode=143 Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.840576 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.842846 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerDied","Data":"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"} Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.842955 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerDied","Data":"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"} Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.842980 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30e3d611-ff14-4db8-967c-45ccac308355","Type":"ContainerDied","Data":"e635b6c8c1f7cb1f1657936eba67bf0a6396e6b4c11950d1ffa26a0f8a54d3b1"} Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.843014 4952 scope.go:117] "RemoveContainer" containerID="0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.900968 4952 scope.go:117] "RemoveContainer" containerID="58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.904701 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.918054 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.929966 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:41 crc kubenswrapper[4952]: E1122 03:12:41.931003 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-log" Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.931028 4952 
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.931028 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-log"
Nov 22 03:12:41 crc kubenswrapper[4952]: E1122 03:12:41.931049 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-metadata"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.931057 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-metadata"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.931259 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-log"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.931286 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e3d611-ff14-4db8-967c-45ccac308355" containerName="nova-metadata-metadata"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.932668 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.938374 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.938761 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.940818 4952 scope.go:117] "RemoveContainer" containerID="0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"
Nov 22 03:12:41 crc kubenswrapper[4952]: E1122 03:12:41.941402 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7\": container with ID starting with 0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7 not found: ID does not exist" containerID="0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.941449 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"} err="failed to get container status \"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7\": rpc error: code = NotFound desc = could not find container \"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7\": container with ID starting with 0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7 not found: ID does not exist"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.941477 4952 scope.go:117] "RemoveContainer" containerID="58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"
Nov 22 03:12:41 crc kubenswrapper[4952]: E1122 03:12:41.941787 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97\": container with ID starting with 58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97 not found: ID does not exist" containerID="58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.941822 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"} err="failed to get container status \"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97\": rpc error: code = NotFound desc = could not find container \"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97\": container with ID starting with 58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97 not found: ID does not exist"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.941849 4952 scope.go:117] "RemoveContainer" containerID="0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.942025 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7"} err="failed to get container status \"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7\": rpc error: code = NotFound desc = could not find container \"0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7\": container with ID starting with 0786672713b82a58119dcdf8b417ffabb56ef0b47583f9fa66be03d643fd2fd7 not found: ID does not exist"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.942048 4952 scope.go:117] "RemoveContainer" containerID="58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.942206 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97"} err="failed to get container status \"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97\": rpc error: code = NotFound desc = could not find container \"58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97\": container with ID starting with 58a9adfc06618d7b63f9c4b03d4b5f4cda93513675e58cf4340e01a382ff2e97 not found: ID does not exist"
Nov 22 03:12:41 crc kubenswrapper[4952]: I1122 03:12:41.965833 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.034780 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.035146 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6qt\" (UniqueName: \"kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.035597 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.035719 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.036755 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.139130 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.139258 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.139355 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6qt\" (UniqueName: \"kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.139477 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.139573 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.140454 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.148172 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.153050 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.158790 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.159366 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6qt\" (UniqueName: \"kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt\") pod \"nova-metadata-0\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.264902 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.584436 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e3d611-ff14-4db8-967c-45ccac308355" path="/var/lib/kubelet/pods/30e3d611-ff14-4db8-967c-45ccac308355/volumes"
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.810170 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 03:12:42 crc kubenswrapper[4952]: I1122 03:12:42.868252 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerStarted","Data":"c0c05b09b24dcd0a7bbba2bce8e1898969347a573c66c70afd1b967d59bd6342"}
Nov 22 03:12:43 crc kubenswrapper[4952]: I1122 03:12:43.892896 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerStarted","Data":"d67faeaebcedbb438161d872adc255bd7c3bfe826917cae0765875093146cdd8"}
Nov 22 03:12:43 crc kubenswrapper[4952]: I1122 03:12:43.893785 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerStarted","Data":"9b2a297f683c8c29c8c99b6d4633a3b1557340ea0a02f913847493b4aa21c3be"}
Nov 22 03:12:43 crc kubenswrapper[4952]: I1122 03:12:43.928024 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9280023120000003 podStartE2EDuration="2.928002312s" podCreationTimestamp="2025-11-22 03:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:43.925841564 +0000 UTC m=+1128.231858867" watchObservedRunningTime="2025-11-22 03:12:43.928002312 +0000 UTC m=+1128.234019575"
Nov 22 03:12:44 crc kubenswrapper[4952]: I1122 03:12:44.908369 4952 generic.go:334] "Generic (PLEG): container finished" podID="8e7d5803-2a94-4be7-874e-59415a346d19" containerID="9c1b04376b7b9ff7f3b32665c342ea62e578e26d4d3df5d6910626f8d4d20806" exitCode=0
Nov 22 03:12:44 crc kubenswrapper[4952]: I1122 03:12:44.908732 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ng2bq" event={"ID":"8e7d5803-2a94-4be7-874e-59415a346d19","Type":"ContainerDied","Data":"9c1b04376b7b9ff7f3b32665c342ea62e578e26d4d3df5d6910626f8d4d20806"}
Nov 22 03:12:44 crc kubenswrapper[4952]: I1122 03:12:44.913929 4952 generic.go:334] "Generic (PLEG): container finished" podID="ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" containerID="ae364a3668a54a6d4f786a4c0d76755aadde0432358b62b044d5b0f767ccc98b" exitCode=0
Nov 22 03:12:44 crc kubenswrapper[4952]: I1122 03:12:44.913996 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k62bf" event={"ID":"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279","Type":"ContainerDied","Data":"ae364a3668a54a6d4f786a4c0d76755aadde0432358b62b044d5b0f767ccc98b"}
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.129365 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.422194 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.422953 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.436102 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.436190 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.475036 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.599840 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5"
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.689313 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"]
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.924211 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="dnsmasq-dns" containerID="cri-o://f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711" gracePeriod=10
Nov 22 03:12:45 crc kubenswrapper[4952]: I1122 03:12:45.988264 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.528473 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.528601 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.549295 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data\") pod \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.549507 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts\") pod \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.549617 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle\") pod \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.549707 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhgf\" (UniqueName: \"kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf\") pod \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\" (UID: \"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.555605 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts" (OuterVolumeSpecName: "scripts") pod "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" (UID: "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.556431 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf" (OuterVolumeSpecName: "kube-api-access-dqhgf") pod "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" (UID: "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279"). InnerVolumeSpecName "kube-api-access-dqhgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.585968 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data" (OuterVolumeSpecName: "config-data") pod "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" (UID: "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.592594 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" (UID: "ec9f0941-6a38-4fc6-bb6c-bdc23d78a279"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.608747 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.609917 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.652750 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.652797 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.652808 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhgf\" (UniqueName: \"kubernetes.io/projected/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-kube-api-access-dqhgf\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.652818 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754329 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb\") pod \"e8990647-b688-4c7d-acfa-c1287965cc3d\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754378 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data\") pod \"8e7d5803-2a94-4be7-874e-59415a346d19\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754411 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config\") pod \"e8990647-b688-4c7d-acfa-c1287965cc3d\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754449 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts\") pod \"8e7d5803-2a94-4be7-874e-59415a346d19\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754530 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc\") pod \"e8990647-b688-4c7d-acfa-c1287965cc3d\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754612 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx2ds\" (UniqueName: \"kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds\") pod \"8e7d5803-2a94-4be7-874e-59415a346d19\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754657 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb\") pod \"e8990647-b688-4c7d-acfa-c1287965cc3d\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754701 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5l7w\" (UniqueName: \"kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w\") pod \"e8990647-b688-4c7d-acfa-c1287965cc3d\" (UID: \"e8990647-b688-4c7d-acfa-c1287965cc3d\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.754759 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle\") pod \"8e7d5803-2a94-4be7-874e-59415a346d19\" (UID: \"8e7d5803-2a94-4be7-874e-59415a346d19\") " Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.761939 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w" (OuterVolumeSpecName: "kube-api-access-c5l7w") pod "e8990647-b688-4c7d-acfa-c1287965cc3d" (UID: "e8990647-b688-4c7d-acfa-c1287965cc3d"). InnerVolumeSpecName "kube-api-access-c5l7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.764104 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts" (OuterVolumeSpecName: "scripts") pod "8e7d5803-2a94-4be7-874e-59415a346d19" (UID: "8e7d5803-2a94-4be7-874e-59415a346d19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.765877 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds" (OuterVolumeSpecName: "kube-api-access-cx2ds") pod "8e7d5803-2a94-4be7-874e-59415a346d19" (UID: "8e7d5803-2a94-4be7-874e-59415a346d19"). InnerVolumeSpecName "kube-api-access-cx2ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.796239 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e7d5803-2a94-4be7-874e-59415a346d19" (UID: "8e7d5803-2a94-4be7-874e-59415a346d19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.800325 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data" (OuterVolumeSpecName: "config-data") pod "8e7d5803-2a94-4be7-874e-59415a346d19" (UID: "8e7d5803-2a94-4be7-874e-59415a346d19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.812573 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8990647-b688-4c7d-acfa-c1287965cc3d" (UID: "e8990647-b688-4c7d-acfa-c1287965cc3d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.819939 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8990647-b688-4c7d-acfa-c1287965cc3d" (UID: "e8990647-b688-4c7d-acfa-c1287965cc3d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.819959 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8990647-b688-4c7d-acfa-c1287965cc3d" (UID: "e8990647-b688-4c7d-acfa-c1287965cc3d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.824699 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config" (OuterVolumeSpecName: "config") pod "e8990647-b688-4c7d-acfa-c1287965cc3d" (UID: "e8990647-b688-4c7d-acfa-c1287965cc3d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858010 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858058 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858071 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858085 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858095 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858105 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx2ds\" (UniqueName: \"kubernetes.io/projected/8e7d5803-2a94-4be7-874e-59415a346d19-kube-api-access-cx2ds\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858115 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8990647-b688-4c7d-acfa-c1287965cc3d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858124 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5l7w\" (UniqueName: \"kubernetes.io/projected/e8990647-b688-4c7d-acfa-c1287965cc3d-kube-api-access-c5l7w\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.858133 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d5803-2a94-4be7-874e-59415a346d19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.936912 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k62bf" event={"ID":"ec9f0941-6a38-4fc6-bb6c-bdc23d78a279","Type":"ContainerDied","Data":"07212f47953784b4676bbbc43a531b5b41424adde36bda703b29c3e10641dbe0"} Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.936959 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07212f47953784b4676bbbc43a531b5b41424adde36bda703b29c3e10641dbe0" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.937038 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k62bf" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.943255 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ng2bq" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.943279 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ng2bq" event={"ID":"8e7d5803-2a94-4be7-874e-59415a346d19","Type":"ContainerDied","Data":"7addce2bc7dcd885f7641549060bb1f773275dd1d66e320cf2c77a573c2a636f"} Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.943331 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7addce2bc7dcd885f7641549060bb1f773275dd1d66e320cf2c77a573c2a636f" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.946164 4952 generic.go:334] "Generic (PLEG): container finished" podID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerID="f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711" exitCode=0 Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.946366 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" event={"ID":"e8990647-b688-4c7d-acfa-c1287965cc3d","Type":"ContainerDied","Data":"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711"} Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.946417 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" event={"ID":"e8990647-b688-4c7d-acfa-c1287965cc3d","Type":"ContainerDied","Data":"e32d7c8708ea591c95d4533b9c171de99de6d34a630ec51a502cc64ad542c354"} Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.946425 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zxqbc" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.946446 4952 scope.go:117] "RemoveContainer" containerID="f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711" Nov 22 03:12:46 crc kubenswrapper[4952]: I1122 03:12:46.995758 4952 scope.go:117] "RemoveContainer" containerID="97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.060164 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.072821 4952 scope.go:117] "RemoveContainer" containerID="f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.105991 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zxqbc"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.137674 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.138204 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="init" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138219 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="init" Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.138246 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" containerName="nova-cell1-conductor-db-sync" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138253 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" containerName="nova-cell1-conductor-db-sync" Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.138264 
4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d5803-2a94-4be7-874e-59415a346d19" containerName="nova-manage" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138271 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d5803-2a94-4be7-874e-59415a346d19" containerName="nova-manage" Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.138292 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="dnsmasq-dns" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138297 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="dnsmasq-dns" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138480 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7d5803-2a94-4be7-874e-59415a346d19" containerName="nova-manage" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138499 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" containerName="nova-cell1-conductor-db-sync" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.138511 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" containerName="dnsmasq-dns" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.139345 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.144315 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711\": container with ID starting with f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711 not found: ID does not exist" containerID="f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.144353 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711"} err="failed to get container status \"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711\": rpc error: code = NotFound desc = could not find container \"f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711\": container with ID starting with f735e92f8358d3f5f801545fbc4d52dc6357804eb16e8398ceffe500e44d7711 not found: ID does not exist" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.144384 4952 scope.go:117] "RemoveContainer" containerID="97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768" Nov 22 03:12:47 crc kubenswrapper[4952]: E1122 03:12:47.149212 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768\": container with ID starting with 97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768 not found: ID does not exist" containerID="97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.149259 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768"} err="failed to get container status \"97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768\": rpc 
error: code = NotFound desc = could not find container \"97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768\": container with ID starting with 97056bb7e99ad605610567594885ab13d6994809d9e18faee8fc928139df8768 not found: ID does not exist" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.149325 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.174166 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.200971 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.201426 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-log" containerID="cri-o://842c96526b437073cef231b947fb269736331f7dee1711886023affda9c4b430" gracePeriod=30 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.201960 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-api" containerID="cri-o://375056e0277bb11196aae00a2a8b965456a69a78f4d67fe129dd5ebcb7ef9e72" gracePeriod=30 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.221218 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.258924 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.259437 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-metadata" containerID="cri-o://d67faeaebcedbb438161d872adc255bd7c3bfe826917cae0765875093146cdd8" gracePeriod=30 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.259359 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-log" containerID="cri-o://9b2a297f683c8c29c8c99b6d4633a3b1557340ea0a02f913847493b4aa21c3be" gracePeriod=30 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.265150 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.265205 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.274605 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.274773 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 
03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.274998 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvl6\" (UniqueName: \"kubernetes.io/projected/cdf38f2c-d0ed-4724-b542-fe296d6d6466-kube-api-access-ptvl6\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.377497 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvl6\" (UniqueName: \"kubernetes.io/projected/cdf38f2c-d0ed-4724-b542-fe296d6d6466-kube-api-access-ptvl6\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.377654 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.377708 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.387745 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.388291 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf38f2c-d0ed-4724-b542-fe296d6d6466-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.395956 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvl6\" (UniqueName: \"kubernetes.io/projected/cdf38f2c-d0ed-4724-b542-fe296d6d6466-kube-api-access-ptvl6\") pod \"nova-cell1-conductor-0\" (UID: \"cdf38f2c-d0ed-4724-b542-fe296d6d6466\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.471322 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.964101 4952 generic.go:334] "Generic (PLEG): container finished" podID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerID="842c96526b437073cef231b947fb269736331f7dee1711886023affda9c4b430" exitCode=143 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.964200 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerDied","Data":"842c96526b437073cef231b947fb269736331f7dee1711886023affda9c4b430"} Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.968258 4952 generic.go:334] "Generic (PLEG): container finished" podID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerID="d67faeaebcedbb438161d872adc255bd7c3bfe826917cae0765875093146cdd8" exitCode=0 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.968280 4952 generic.go:334] "Generic (PLEG): container finished" podID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerID="9b2a297f683c8c29c8c99b6d4633a3b1557340ea0a02f913847493b4aa21c3be" exitCode=143 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.968622 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerName="nova-scheduler-scheduler" containerID="cri-o://e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" gracePeriod=30 Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.968721 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerDied","Data":"d67faeaebcedbb438161d872adc255bd7c3bfe826917cae0765875093146cdd8"} Nov 22 03:12:47 crc kubenswrapper[4952]: I1122 03:12:47.968747 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerDied","Data":"9b2a297f683c8c29c8c99b6d4633a3b1557340ea0a02f913847493b4aa21c3be"} Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.034417 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.075771 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.097726 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6qt\" (UniqueName: \"kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt\") pod \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.098026 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data\") pod \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.099063 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle\") pod \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.099171 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs\") pod \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.099249 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs\") pod \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\" (UID: \"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856\") " Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.100745 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs" (OuterVolumeSpecName: "logs") pod "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" (UID: "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.201754 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.358908 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt" (OuterVolumeSpecName: "kube-api-access-pm6qt") pod "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" (UID: "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856"). InnerVolumeSpecName "kube-api-access-pm6qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.366031 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" (UID: "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.367330 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data" (OuterVolumeSpecName: "config-data") pod "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" (UID: "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.394055 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" (UID: "8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.406751 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.406786 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.406802 4952 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.406816 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6qt\" (UniqueName: \"kubernetes.io/projected/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856-kube-api-access-pm6qt\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.545107 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8990647-b688-4c7d-acfa-c1287965cc3d" path="/var/lib/kubelet/pods/e8990647-b688-4c7d-acfa-c1287965cc3d/volumes" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.982067 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856","Type":"ContainerDied","Data":"c0c05b09b24dcd0a7bbba2bce8e1898969347a573c66c70afd1b967d59bd6342"} Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.982090 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.982421 4952 scope.go:117] "RemoveContainer" containerID="d67faeaebcedbb438161d872adc255bd7c3bfe826917cae0765875093146cdd8" Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.985275 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cdf38f2c-d0ed-4724-b542-fe296d6d6466","Type":"ContainerStarted","Data":"03ed87757924c4d770c07fb6115cda20ff1c91bd85c270e5a6d6c4b17da68fa9"} Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.985349 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cdf38f2c-d0ed-4724-b542-fe296d6d6466","Type":"ContainerStarted","Data":"76c7ab59ecb4230595885412bf711b6c930002db93da612de38c62d5c8625a97"} Nov 22 03:12:48 crc kubenswrapper[4952]: I1122 03:12:48.985495 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.008333 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.015932 4952 scope.go:117] "RemoveContainer" containerID="9b2a297f683c8c29c8c99b6d4633a3b1557340ea0a02f913847493b4aa21c3be" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.019530 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.035507 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.035482455 podStartE2EDuration="2.035482455s" podCreationTimestamp="2025-11-22 03:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:49.025221952 +0000 UTC m=+1133.331239235" watchObservedRunningTime="2025-11-22 03:12:49.035482455 +0000 UTC m=+1133.341499728" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.056735 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:49 crc kubenswrapper[4952]: E1122 03:12:49.057207 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-log" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.057228 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-log" Nov 22 03:12:49 crc kubenswrapper[4952]: E1122 03:12:49.057251 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-metadata" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.057257 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-metadata" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.057438 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-log" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.057486 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" containerName="nova-metadata-metadata" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.058534 4952 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.062332 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.062728 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.090355 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.121927 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.122011 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcpl\" (UniqueName: \"kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.122057 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.122171 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.122212 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.224009 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.224072 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcpl\" (UniqueName: \"kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.224105 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " 
pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.224231 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.224257 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.226121 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.231556 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.232063 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.235440 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.245888 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcpl\" (UniqueName: \"kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl\") pod \"nova-metadata-0\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.409894 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.858534 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.968649 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:49 crc kubenswrapper[4952]: I1122 03:12:49.998761 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerStarted","Data":"de9563ac5f45702da09c74c63566ec9969ac179e252c06056b889a4fd41f220b"} Nov 22 03:12:50 crc kubenswrapper[4952]: E1122 03:12:50.424663 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:12:50 crc kubenswrapper[4952]: E1122 03:12:50.426963 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:12:50 crc kubenswrapper[4952]: E1122 03:12:50.428693 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:12:50 crc kubenswrapper[4952]: E1122 03:12:50.428761 4952 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerName="nova-scheduler-scheduler" Nov 22 03:12:50 crc kubenswrapper[4952]: I1122 03:12:50.544622 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856" path="/var/lib/kubelet/pods/8cccc3ff-8b79-49a2-9d6a-e1eb75b4c856/volumes" Nov 22 03:12:51 crc kubenswrapper[4952]: I1122 03:12:51.016923 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerStarted","Data":"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4"} Nov 22 03:12:51 crc kubenswrapper[4952]: I1122 03:12:51.017012 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerStarted","Data":"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156"} Nov 22 03:12:51 crc kubenswrapper[4952]: I1122 03:12:51.043705 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.043669414 podStartE2EDuration="2.043669414s" podCreationTimestamp="2025-11-22 03:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:51.037083198 +0000 UTC 
m=+1135.343100481" watchObservedRunningTime="2025-11-22 03:12:51.043669414 +0000 UTC m=+1135.349686697" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.031486 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.046074 4952 generic.go:334] "Generic (PLEG): container finished" podID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" exitCode=0 Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.046219 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5513e5e6-3481-4787-9c6e-ead3418a2137","Type":"ContainerDied","Data":"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55"} Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.046339 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5513e5e6-3481-4787-9c6e-ead3418a2137","Type":"ContainerDied","Data":"ac230a931bb4f99b7deb864e202d02ca42a45c1a628d5d5544fdf5ac923b55ba"} Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.046387 4952 scope.go:117] "RemoveContainer" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.066582 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm8l8\" (UniqueName: \"kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8\") pod \"5513e5e6-3481-4787-9c6e-ead3418a2137\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.066776 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data\") pod \"5513e5e6-3481-4787-9c6e-ead3418a2137\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.066820 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle\") pod \"5513e5e6-3481-4787-9c6e-ead3418a2137\" (UID: \"5513e5e6-3481-4787-9c6e-ead3418a2137\") " Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.081995 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8" (OuterVolumeSpecName: "kube-api-access-tm8l8") pod "5513e5e6-3481-4787-9c6e-ead3418a2137" (UID: "5513e5e6-3481-4787-9c6e-ead3418a2137"). InnerVolumeSpecName "kube-api-access-tm8l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.107917 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5513e5e6-3481-4787-9c6e-ead3418a2137" (UID: "5513e5e6-3481-4787-9c6e-ead3418a2137"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.116523 4952 scope.go:117] "RemoveContainer" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" Nov 22 03:12:52 crc kubenswrapper[4952]: E1122 03:12:52.120772 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55\": container with ID starting with e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55 not found: ID does not exist" containerID="e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.120833 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55"} err="failed to get container status \"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55\": rpc error: code = NotFound desc = could not find container \"e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55\": container with ID starting with e0788b01d54b79ce2ee8ac9b649cd5c8a69830d38523c4406d04ef0b71fcce55 not found: ID does not exist" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.128638 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data" (OuterVolumeSpecName: "config-data") pod "5513e5e6-3481-4787-9c6e-ead3418a2137" (UID: "5513e5e6-3481-4787-9c6e-ead3418a2137"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.168748 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm8l8\" (UniqueName: \"kubernetes.io/projected/5513e5e6-3481-4787-9c6e-ead3418a2137-kube-api-access-tm8l8\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.168791 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:52 crc kubenswrapper[4952]: I1122 03:12:52.168805 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5513e5e6-3481-4787-9c6e-ead3418a2137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.057377 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.061804 4952 generic.go:334] "Generic (PLEG): container finished" podID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerID="375056e0277bb11196aae00a2a8b965456a69a78f4d67fe129dd5ebcb7ef9e72" exitCode=0 Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.061865 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerDied","Data":"375056e0277bb11196aae00a2a8b965456a69a78f4d67fe129dd5ebcb7ef9e72"} Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.061901 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4288052-cfd5-44a1-b156-06a7ee436d82","Type":"ContainerDied","Data":"575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0"} Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.061914 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575b55a49e93f0893a9bdef6ffea3b1cc63d03d8b2c88cc5b7bec2c913f84fd0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.095173 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.115409 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.124149 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.169735 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:53 crc kubenswrapper[4952]: E1122 03:12:53.170380 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-api" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170430 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-api" Nov 22 03:12:53 crc kubenswrapper[4952]: E1122 03:12:53.170471 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerName="nova-scheduler-scheduler" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170480 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerName="nova-scheduler-scheduler" Nov 22 03:12:53 crc kubenswrapper[4952]: E1122 03:12:53.170496 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-log" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170505 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-log" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170770 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-log" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170790 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" containerName="nova-api-api" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.170804 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" containerName="nova-scheduler-scheduler" 
Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.172669 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.175001 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.191092 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291256 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lln7w\" (UniqueName: \"kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w\") pod \"d4288052-cfd5-44a1-b156-06a7ee436d82\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291409 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data\") pod \"d4288052-cfd5-44a1-b156-06a7ee436d82\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291465 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs\") pod \"d4288052-cfd5-44a1-b156-06a7ee436d82\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291522 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle\") pod \"d4288052-cfd5-44a1-b156-06a7ee436d82\" (UID: \"d4288052-cfd5-44a1-b156-06a7ee436d82\") " Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291880 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291946 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.291983 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmfq\" (UniqueName: \"kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.293140 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs" (OuterVolumeSpecName: "logs") pod "d4288052-cfd5-44a1-b156-06a7ee436d82" (UID: "d4288052-cfd5-44a1-b156-06a7ee436d82"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.298227 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w" (OuterVolumeSpecName: "kube-api-access-lln7w") pod "d4288052-cfd5-44a1-b156-06a7ee436d82" (UID: "d4288052-cfd5-44a1-b156-06a7ee436d82"). InnerVolumeSpecName "kube-api-access-lln7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.324729 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data" (OuterVolumeSpecName: "config-data") pod "d4288052-cfd5-44a1-b156-06a7ee436d82" (UID: "d4288052-cfd5-44a1-b156-06a7ee436d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.332633 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4288052-cfd5-44a1-b156-06a7ee436d82" (UID: "d4288052-cfd5-44a1-b156-06a7ee436d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.393835 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394637 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394681 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmfq\" (UniqueName: \"kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394915 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4288052-cfd5-44a1-b156-06a7ee436d82-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394934 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394956 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lln7w\" (UniqueName: \"kubernetes.io/projected/d4288052-cfd5-44a1-b156-06a7ee436d82-kube-api-access-lln7w\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.394968 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4288052-cfd5-44a1-b156-06a7ee436d82-config-data\") on node \"crc\" DevicePath 
\"\"" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.398292 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.400336 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.417652 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmfq\" (UniqueName: \"kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq\") pod \"nova-scheduler-0\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.489813 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:12:53 crc kubenswrapper[4952]: I1122 03:12:53.778258 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:53 crc kubenswrapper[4952]: W1122 03:12:53.781303 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7378007_7521_4913_bfba_431de1bc6b02.slice/crio-ac5444b657572d63aee63e1f3ca99de3555a8edb0815f9420b9e7a4a86233f75 WatchSource:0}: Error finding container ac5444b657572d63aee63e1f3ca99de3555a8edb0815f9420b9e7a4a86233f75: Status 404 returned error can't find the container with id ac5444b657572d63aee63e1f3ca99de3555a8edb0815f9420b9e7a4a86233f75 Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.075403 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7378007-7521-4913-bfba-431de1bc6b02","Type":"ContainerStarted","Data":"ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905"} Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.075481 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7378007-7521-4913-bfba-431de1bc6b02","Type":"ContainerStarted","Data":"ac5444b657572d63aee63e1f3ca99de3555a8edb0815f9420b9e7a4a86233f75"} Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.075429 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.119060 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.119025631 podStartE2EDuration="1.119025631s" podCreationTimestamp="2025-11-22 03:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:54.098417821 +0000 UTC m=+1138.404435094" watchObservedRunningTime="2025-11-22 03:12:54.119025631 +0000 UTC m=+1138.425042934" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.140433 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.153002 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.161768 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.164047 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.167640 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.171822 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.231723 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.231822 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tf6z\" (UniqueName: \"kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.231872 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.231945 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.333775 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.334190 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tf6z\" (UniqueName: 
\"kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.334386 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.335218 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.335658 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.341389 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.342705 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.354403 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tf6z\" (UniqueName: \"kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z\") pod \"nova-api-0\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.411396 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.411466 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.511916 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.544889 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5513e5e6-3481-4787-9c6e-ead3418a2137" path="/var/lib/kubelet/pods/5513e5e6-3481-4787-9c6e-ead3418a2137/volumes" Nov 22 03:12:54 crc kubenswrapper[4952]: I1122 03:12:54.545473 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4288052-cfd5-44a1-b156-06a7ee436d82" path="/var/lib/kubelet/pods/d4288052-cfd5-44a1-b156-06a7ee436d82/volumes" Nov 22 03:12:55 crc kubenswrapper[4952]: I1122 03:12:55.012440 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:55 crc kubenswrapper[4952]: W1122 03:12:55.016535 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d84fa1_deff_4ccd_bdfd_12b6e3bfcaf5.slice/crio-4d9e983041bbde697394d9a55422c0dd875e132f1834e9a60e8c3c8b7e4328b0 WatchSource:0}: Error finding container 4d9e983041bbde697394d9a55422c0dd875e132f1834e9a60e8c3c8b7e4328b0: Status 404 returned error can't find the container with id 4d9e983041bbde697394d9a55422c0dd875e132f1834e9a60e8c3c8b7e4328b0 Nov 22 03:12:55 crc kubenswrapper[4952]: I1122 03:12:55.090836 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerStarted","Data":"4d9e983041bbde697394d9a55422c0dd875e132f1834e9a60e8c3c8b7e4328b0"} Nov 22 03:12:56 crc kubenswrapper[4952]: I1122 03:12:56.110143 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerStarted","Data":"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb"} Nov 22 03:12:56 crc kubenswrapper[4952]: I1122 03:12:56.111259 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerStarted","Data":"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974"} Nov 22 03:12:56 crc kubenswrapper[4952]: I1122 03:12:56.141372 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.141344507 podStartE2EDuration="2.141344507s" podCreationTimestamp="2025-11-22 03:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:56.139741934 +0000 UTC m=+1140.445759227" watchObservedRunningTime="2025-11-22 03:12:56.141344507 +0000 UTC m=+1140.447361820" Nov 22 03:12:57 crc kubenswrapper[4952]: I1122 03:12:57.510251 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.342487 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.342623 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.342682 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.343749 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.343830 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5" gracePeriod=600 Nov 22 03:12:58 crc kubenswrapper[4952]: I1122 03:12:58.490496 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.155708 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5" exitCode=0 Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.155893 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5"} Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.156185 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5"} Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.156224 4952 scope.go:117] "RemoveContainer" containerID="9abb162c6e80f1a9b9ed3e044dff4a6d18eb9dcfbe293208b96a0a02169b6b19" Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.411084 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 03:12:59 crc kubenswrapper[4952]: I1122 03:12:59.411840 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 03:13:00 crc kubenswrapper[4952]: I1122 03:13:00.425884 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 03:13:00 crc kubenswrapper[4952]: I1122 03:13:00.425974 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 03:13:03 crc kubenswrapper[4952]: I1122 03:13:03.490623 4952 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 03:13:03 crc kubenswrapper[4952]: I1122 03:13:03.535373 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 03:13:04 crc kubenswrapper[4952]: I1122 03:13:04.276042 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 03:13:04 crc kubenswrapper[4952]: I1122 03:13:04.512427 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:13:04 crc kubenswrapper[4952]: I1122 03:13:04.512505 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:13:05 crc kubenswrapper[4952]: I1122 03:13:05.597884 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 03:13:05 crc kubenswrapper[4952]: I1122 03:13:05.597910 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 03:13:09 crc kubenswrapper[4952]: I1122 03:13:09.419789 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 03:13:09 crc kubenswrapper[4952]: I1122 03:13:09.421666 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 03:13:09 crc kubenswrapper[4952]: I1122 03:13:09.430468 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 03:13:10 crc kubenswrapper[4952]: I1122 03:13:10.296819 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.229745 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.306041 4952 generic.go:334] "Generic (PLEG): container finished" podID="f6f78980-48c3-49d7-8127-6d06c53df6f8" containerID="b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c" exitCode=137 Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.306158 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6f78980-48c3-49d7-8127-6d06c53df6f8","Type":"ContainerDied","Data":"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c"} Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.306213 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.306259 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6f78980-48c3-49d7-8127-6d06c53df6f8","Type":"ContainerDied","Data":"2867ba8ca66a110d2d49fa497c5ff5320078596fa78a357ffb246f66ebeba14b"} Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.306307 4952 scope.go:117] "RemoveContainer" containerID="b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.334089 4952 scope.go:117] "RemoveContainer" containerID="b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c" Nov 22 03:13:11 crc kubenswrapper[4952]: E1122 03:13:11.334902 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c\": container with ID starting with b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c not found: ID does not exist" containerID="b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.335031 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c"} err="failed to get container status \"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c\": rpc error: code = NotFound desc = could not find container \"b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c\": container with ID starting with b78e8050001c1e87e21a621dfeafce0e5e2d03698b7153634d2931731add067c not found: ID does not exist" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.383912 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data\") pod \"f6f78980-48c3-49d7-8127-6d06c53df6f8\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.384046 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle\") pod \"f6f78980-48c3-49d7-8127-6d06c53df6f8\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.384194 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2ckl\" (UniqueName: \"kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl\") pod \"f6f78980-48c3-49d7-8127-6d06c53df6f8\" (UID: \"f6f78980-48c3-49d7-8127-6d06c53df6f8\") " Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.392715 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl" (OuterVolumeSpecName: "kube-api-access-r2ckl") pod "f6f78980-48c3-49d7-8127-6d06c53df6f8" (UID: "f6f78980-48c3-49d7-8127-6d06c53df6f8"). InnerVolumeSpecName "kube-api-access-r2ckl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.420370 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6f78980-48c3-49d7-8127-6d06c53df6f8" (UID: "f6f78980-48c3-49d7-8127-6d06c53df6f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.421591 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data" (OuterVolumeSpecName: "config-data") pod "f6f78980-48c3-49d7-8127-6d06c53df6f8" (UID: "f6f78980-48c3-49d7-8127-6d06c53df6f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.487358 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.487720 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2ckl\" (UniqueName: \"kubernetes.io/projected/f6f78980-48c3-49d7-8127-6d06c53df6f8-kube-api-access-r2ckl\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.487854 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6f78980-48c3-49d7-8127-6d06c53df6f8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.712104 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.725031 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.740440 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:13:11 crc kubenswrapper[4952]: E1122 03:13:11.741327 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f78980-48c3-49d7-8127-6d06c53df6f8" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.741600 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f78980-48c3-49d7-8127-6d06c53df6f8" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.741930 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f78980-48c3-49d7-8127-6d06c53df6f8" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.744106 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.745057 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.746762 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.747379 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.747455 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.796094 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.796440 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.796571 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8xm\" (UniqueName: \"kubernetes.io/projected/b5a83aed-143b-40e4-a06a-7452102935c2-kube-api-access-dr8xm\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.796700 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.796858 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.898481 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.899198 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8xm\" (UniqueName: \"kubernetes.io/projected/b5a83aed-143b-40e4-a06a-7452102935c2-kube-api-access-dr8xm\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.899461 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.900045 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.900320 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.903421 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.904093 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.905206 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.907044 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a83aed-143b-40e4-a06a-7452102935c2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:11 crc kubenswrapper[4952]: I1122 03:13:11.921837 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8xm\" (UniqueName: \"kubernetes.io/projected/b5a83aed-143b-40e4-a06a-7452102935c2-kube-api-access-dr8xm\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5a83aed-143b-40e4-a06a-7452102935c2\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:12 crc kubenswrapper[4952]: I1122 03:13:12.067767 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:12 crc kubenswrapper[4952]: I1122 03:13:12.562720 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f78980-48c3-49d7-8127-6d06c53df6f8" path="/var/lib/kubelet/pods/f6f78980-48c3-49d7-8127-6d06c53df6f8/volumes" Nov 22 03:13:12 crc kubenswrapper[4952]: I1122 03:13:12.589737 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:13:12 crc kubenswrapper[4952]: W1122 03:13:12.594012 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a83aed_143b_40e4_a06a_7452102935c2.slice/crio-88c90dd1b045d21f3a39a9d0c7625df54e981794b76b31d673211acf84ecfbf4 WatchSource:0}: Error finding container 88c90dd1b045d21f3a39a9d0c7625df54e981794b76b31d673211acf84ecfbf4: Status 404 returned error can't find the container with id 88c90dd1b045d21f3a39a9d0c7625df54e981794b76b31d673211acf84ecfbf4 Nov 22 03:13:13 crc kubenswrapper[4952]: I1122 03:13:13.338350 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5a83aed-143b-40e4-a06a-7452102935c2","Type":"ContainerStarted","Data":"794b16ce5a8ad8c709697dffb548d452c73800f53a3eb8e3d1353017f7734d95"} Nov 22 03:13:13 crc kubenswrapper[4952]: I1122 03:13:13.338996 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5a83aed-143b-40e4-a06a-7452102935c2","Type":"ContainerStarted","Data":"88c90dd1b045d21f3a39a9d0c7625df54e981794b76b31d673211acf84ecfbf4"} Nov 22 03:13:13 crc kubenswrapper[4952]: I1122 03:13:13.381797 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.381761549 podStartE2EDuration="2.381761549s" podCreationTimestamp="2025-11-22 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:13.367219331 +0000 UTC m=+1157.673236654" watchObservedRunningTime="2025-11-22 03:13:13.381761549 +0000 UTC m=+1157.687778832" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.519913 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.520066 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.520753 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.520829 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.528206 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.529914 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.848850 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.854602 4952 util.go:30] "No sandbox for pod can be found. 
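
[annotation] The pod_startup_latency_tracker record above is directly checkable arithmetic: for nova-cell1-novncproxy-0 both firstStartedPulling and lastFinishedPulling are the zero time (no image pull), so podStartSLOduration and podStartE2EDuration both come out as watchObservedRunningTime (03:13:13.381761549) minus podCreationTimestamp (03:13:11), i.e. 2.381761549s; the nova-api-0 record at 03:12:56 follows the same pattern (2.141344507s). A short Go check of that subtraction:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the tracker record above.
	created, _ := time.Parse(time.RFC3339Nano, "2025-11-22T03:13:11Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-11-22T03:13:13.381761549Z")
	fmt.Println(observed.Sub(created)) // prints 2.381761549s, matching podStartSLOduration
}
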
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.869364 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.979127 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.979202 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.979233 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.979321 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:14 crc kubenswrapper[4952]: I1122 03:13:14.979380 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2znr\" (UniqueName: \"kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.081993 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.082090 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2znr\" (UniqueName: \"kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.082156 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.082183 4952 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.082206 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.083307 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.091285 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.094115 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.102306 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.133443 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2znr\" (UniqueName: \"kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr\") pod \"dnsmasq-dns-5b856c5697-pz9jv\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.203906 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:15 crc kubenswrapper[4952]: I1122 03:13:15.689100 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:13:16 crc kubenswrapper[4952]: I1122 03:13:16.384154 4952 generic.go:334] "Generic (PLEG): container finished" podID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerID="69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8" exitCode=0 Nov 22 03:13:16 crc kubenswrapper[4952]: I1122 03:13:16.384367 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" event={"ID":"ce5b6b87-7965-4f3a-a929-c5495ef9176d","Type":"ContainerDied","Data":"69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8"} Nov 22 03:13:16 crc kubenswrapper[4952]: I1122 03:13:16.384814 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" event={"ID":"ce5b6b87-7965-4f3a-a929-c5495ef9176d","Type":"ContainerStarted","Data":"41c91b7bf029e2f46f66e0b5ee84c862da7d71f1d96337b65ec15a24f8844ad0"} Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.068414 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.105473 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.105932 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-central-agent" containerID="cri-o://26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.106561 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="proxy-httpd" containerID="cri-o://4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.106920 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-notification-agent" containerID="cri-o://bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.107003 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="sg-core" containerID="cri-o://f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.399775 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerID="4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5" exitCode=0 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.400234 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerID="f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365" exitCode=2 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.399860 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerDied","Data":"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5"} Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.400347 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerDied","Data":"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365"} Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.402617 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" event={"ID":"ce5b6b87-7965-4f3a-a929-c5495ef9176d","Type":"ContainerStarted","Data":"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b"} Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.402802 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.428600 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" podStartSLOduration=3.42857632 podStartE2EDuration="3.42857632s" podCreationTimestamp="2025-11-22 03:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:17.421670626 +0000 UTC m=+1161.727687909" watchObservedRunningTime="2025-11-22 03:13:17.42857632 +0000 UTC m=+1161.734593603" Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.578528 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.578878 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-api" containerID="cri-o://95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: I1122 03:13:17.578846 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-log" containerID="cri-o://a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974" gracePeriod=30 Nov 22 03:13:17 crc kubenswrapper[4952]: E1122 03:13:17.684015 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d84fa1_deff_4ccd_bdfd_12b6e3bfcaf5.slice/crio-a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd2de5e_879f_48ce_86d9_175baea81ab6.slice/crio-26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:13:18 crc kubenswrapper[4952]: I1122 03:13:18.416771 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerID="26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580" exitCode=0 Nov 22 03:13:18 crc kubenswrapper[4952]: I1122 03:13:18.416818 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerDied","Data":"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580"} Nov 22 
03:13:18 crc kubenswrapper[4952]: I1122 03:13:18.420470 4952 generic.go:334] "Generic (PLEG): container finished" podID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerID="a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974" exitCode=143 Nov 22 03:13:18 crc kubenswrapper[4952]: I1122 03:13:18.420575 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerDied","Data":"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974"} Nov 22 03:13:19 crc kubenswrapper[4952]: I1122 03:13:19.853426 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.166:3000/\": dial tcp 10.217.0.166:3000: connect: connection refused" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.069356 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125332 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125452 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125495 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125593 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125637 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125763 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.125792 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 
crc kubenswrapper[4952]: I1122 03:13:21.125829 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd\") pod \"8fd2de5e-879f-48ce-86d9-175baea81ab6\" (UID: \"8fd2de5e-879f-48ce-86d9-175baea81ab6\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.126346 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.126664 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.137377 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl" (OuterVolumeSpecName: "kube-api-access-m8gkl") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "kube-api-access-m8gkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.155054 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts" (OuterVolumeSpecName: "scripts") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.198437 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.222241 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.227807 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.227843 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.227854 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd2de5e-879f-48ce-86d9-175baea81ab6-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.227866 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.227878 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8gkl\" (UniqueName: \"kubernetes.io/projected/8fd2de5e-879f-48ce-86d9-175baea81ab6-kube-api-access-m8gkl\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.279907 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.311764 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data" (OuterVolumeSpecName: "config-data") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.315937 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fd2de5e-879f-48ce-86d9-175baea81ab6" (UID: "8fd2de5e-879f-48ce-86d9-175baea81ab6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.330374 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs\") pod \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.330486 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data\") pod \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.330631 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle\") pod \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.330744 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tf6z\" (UniqueName: \"kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z\") pod \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\" (UID: \"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5\") " Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.331250 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs" (OuterVolumeSpecName: "logs") pod "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" (UID: "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.331442 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.331467 4952 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.331478 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.331489 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd2de5e-879f-48ce-86d9-175baea81ab6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.333773 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z" (OuterVolumeSpecName: "kube-api-access-7tf6z") pod "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" (UID: "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5"). InnerVolumeSpecName "kube-api-access-7tf6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.357121 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" (UID: "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.367867 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data" (OuterVolumeSpecName: "config-data") pod "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" (UID: "a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.434287 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.434335 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.434352 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tf6z\" (UniqueName: \"kubernetes.io/projected/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5-kube-api-access-7tf6z\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.453571 4952 generic.go:334] "Generic (PLEG): container finished" podID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerID="bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de" exitCode=0 Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.453680 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerDied","Data":"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de"} Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.453671 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.453732 4952 scope.go:117] "RemoveContainer" containerID="4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.453718 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd2de5e-879f-48ce-86d9-175baea81ab6","Type":"ContainerDied","Data":"145ef0de9c48b62d154eea2d105601d826c14dc15c480352f135222ba4c2bd20"} Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.457048 4952 generic.go:334] "Generic (PLEG): container finished" podID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerID="95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb" exitCode=0 Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.457090 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerDied","Data":"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb"} Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.457113 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5","Type":"ContainerDied","Data":"4d9e983041bbde697394d9a55422c0dd875e132f1834e9a60e8c3c8b7e4328b0"} Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.457188 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.483338 4952 scope.go:117] "RemoveContainer" containerID="f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.502817 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.517526 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.521365 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.522160 4952 scope.go:117] "RemoveContainer" containerID="bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.554954 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.564970 4952 scope.go:117] "RemoveContainer" containerID="26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.600758 4952 scope.go:117] "RemoveContainer" containerID="4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.602339 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5\": container with ID starting with 4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5 not found: ID does not exist" containerID="4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.602426 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5"} 
err="failed to get container status \"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5\": rpc error: code = NotFound desc = could not find container \"4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5\": container with ID starting with 4b0689697e29e11fcf3e3f10a63346d58bf9d4a396b4b6418ce3497a4f2051c5 not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.603182 4952 scope.go:117] "RemoveContainer" containerID="f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.603636 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365\": container with ID starting with f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365 not found: ID does not exist" containerID="f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.603708 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365"} err="failed to get container status \"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365\": rpc error: code = NotFound desc = could not find container \"f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365\": container with ID starting with f93931a3e3d40cb7867242189499222e47756455cf76ebb398df68c36797c365 not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.603741 4952 scope.go:117] "RemoveContainer" containerID="bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.605236 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de\": container with ID starting with bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de not found: ID does not exist" containerID="bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.605273 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de"} err="failed to get container status \"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de\": rpc error: code = NotFound desc = could not find container \"bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de\": container with ID starting with bc2911a64e9cfb6608e61a63851d398f3d0d0d8c55253df31ddba567f427e8de not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.605291 4952 scope.go:117] "RemoveContainer" containerID="26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.605915 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580\": container with ID starting with 26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580 not found: ID does not exist" containerID="26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.605945 4952 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580"} err="failed to get container status \"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580\": rpc error: code = NotFound desc = could not find container \"26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580\": container with ID starting with 26d422bb25945cef6dbb561acd23faacf28d498c7759ecdb7c5a48fb8ee05580 not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.605965 4952 scope.go:117] "RemoveContainer" containerID="95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.620363 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.620987 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-notification-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621032 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-notification-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.621064 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-central-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621074 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-central-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.621086 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-log" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621095 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-log" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.621112 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-api" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621120 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-api" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.621146 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="proxy-httpd" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621157 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="proxy-httpd" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.621183 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="sg-core" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621191 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="sg-core" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621430 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="proxy-httpd" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621456 4952 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-central-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621468 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-api" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621482 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="sg-core" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621494 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" containerName="nova-api-log" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.621506 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" containerName="ceilometer-notification-agent" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.624165 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.630935 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.631234 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.640665 4952 scope.go:117] "RemoveContainer" containerID="a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.641033 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.641303 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.647662 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdvb\" (UniqueName: \"kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.647751 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.647878 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.648038 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.648152 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.648200 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.648275 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.648308 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.651793 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.654237 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.662466 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.669126 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.669206 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.669592 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.748871 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.748928 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.748955 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.748988 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749005 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749039 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdvb\" (UniqueName: \"kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749063 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749123 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749144 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749179 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749203 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wlf\" (UniqueName: \"kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749233 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749253 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749270 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.749688 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.751556 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.753507 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.754566 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.755373 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.757300 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.758239 4952 scope.go:117] "RemoveContainer" containerID="95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.759070 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb\": container with ID starting with 95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb not found: ID does not exist" containerID="95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.759124 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb"} err="failed to get container status \"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb\": rpc error: code = NotFound desc = could not find container \"95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb\": container with ID starting with 
95d1b30f7a155c3caff982d7a350e90763a124347b862743eabcfdb8cf6213bb not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.759158 4952 scope.go:117] "RemoveContainer" containerID="a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974" Nov 22 03:13:21 crc kubenswrapper[4952]: E1122 03:13:21.759884 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974\": container with ID starting with a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974 not found: ID does not exist" containerID="a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.759917 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974"} err="failed to get container status \"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974\": rpc error: code = NotFound desc = could not find container \"a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974\": container with ID starting with a342758114eefc255574039f5243d4a550d8de532e043f47c38a578d61008974 not found: ID does not exist" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.761503 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.774189 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdvb\" (UniqueName: \"kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb\") pod \"ceilometer-0\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " pod="openstack/ceilometer-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851577 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851679 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wlf\" (UniqueName: \"kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851734 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851789 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851836 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.851856 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.853425 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.855866 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.856114 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.856484 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.861790 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:21 crc kubenswrapper[4952]: I1122 03:13:21.870856 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wlf\" (UniqueName: \"kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf\") pod \"nova-api-0\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " pod="openstack/nova-api-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.041944 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.068629 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.091535 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.159698 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.491835 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.557810 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd2de5e-879f-48ce-86d9-175baea81ab6" path="/var/lib/kubelet/pods/8fd2de5e-879f-48ce-86d9-175baea81ab6/volumes" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.558842 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5" path="/var/lib/kubelet/pods/a4d84fa1-deff-4ccd-bdfd-12b6e3bfcaf5/volumes" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.570691 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:13:22 crc kubenswrapper[4952]: W1122 03:13:22.581607 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-360316b03b4cdbb0ad047efda942b4766bb45aa6b6e50c5f803542cf827b86a4 WatchSource:0}: Error finding container 360316b03b4cdbb0ad047efda942b4766bb45aa6b6e50c5f803542cf827b86a4: Status 404 returned error can't find the container with id 360316b03b4cdbb0ad047efda942b4766bb45aa6b6e50c5f803542cf827b86a4 Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.681875 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:22 crc kubenswrapper[4952]: W1122 03:13:22.687067 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d01c1aa_13e7_4bab_bfc1_cf99d0c99629.slice/crio-d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd WatchSource:0}: Error finding container d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd: Status 404 returned error can't find the container with id d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.816723 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-l4xwk"] Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.818587 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.828784 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.829001 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.839526 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l4xwk"] Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.974607 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.974703 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.974820 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:22 crc kubenswrapper[4952]: I1122 03:13:22.974983 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcl8x\" (UniqueName: \"kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.077526 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.077616 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.077687 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.077830 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcl8x\" (UniqueName: 
\"kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.084736 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.084994 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.086431 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.095964 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcl8x\" (UniqueName: \"kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x\") pod \"nova-cell1-cell-mapping-l4xwk\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.173825 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.490751 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerStarted","Data":"736bab4512c8d69a580698ea5780ace6a0b015b183636a940eb865391b295209"} Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.490810 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerStarted","Data":"07166c25dc216572bd6890b512a4a0240020f5e02f36b7887949de8071d6c9aa"} Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.490822 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerStarted","Data":"d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd"} Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.498771 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerStarted","Data":"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621"} Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.498848 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerStarted","Data":"360316b03b4cdbb0ad047efda942b4766bb45aa6b6e50c5f803542cf827b86a4"} Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.726515 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.726485147 podStartE2EDuration="2.726485147s" podCreationTimestamp="2025-11-22 03:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:23.523009081 +0000 UTC m=+1167.829026344" watchObservedRunningTime="2025-11-22 03:13:23.726485147 +0000 UTC m=+1168.032502420" Nov 22 03:13:23 crc kubenswrapper[4952]: I1122 03:13:23.730393 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l4xwk"] Nov 22 03:13:24 crc kubenswrapper[4952]: I1122 03:13:24.512006 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l4xwk" event={"ID":"04233ec3-e35d-4cb2-959d-2fad451655d2","Type":"ContainerStarted","Data":"7aa8ed29b36a3bae662ab78715137eaf060b736a0f6a1049dc7146ab55e8667a"} Nov 22 03:13:24 crc kubenswrapper[4952]: I1122 03:13:24.512459 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l4xwk" event={"ID":"04233ec3-e35d-4cb2-959d-2fad451655d2","Type":"ContainerStarted","Data":"d19cf967d318e8f475699be77ebd4639ace3144669a5e625ba265eef540797ee"} Nov 22 03:13:24 crc kubenswrapper[4952]: I1122 03:13:24.516528 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerStarted","Data":"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131"} Nov 22 03:13:24 crc kubenswrapper[4952]: I1122 03:13:24.538298 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-l4xwk" podStartSLOduration=2.538267473 podStartE2EDuration="2.538267473s" podCreationTimestamp="2025-11-22 03:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:24.528347688 +0000 UTC m=+1168.834364971" watchObservedRunningTime="2025-11-22 03:13:24.538267473 +0000 UTC m=+1168.844284756" Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.205764 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.268037 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.268737 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="dnsmasq-dns" containerID="cri-o://bc789f8d4149663f25d6cca074074343379003da58f29ba1e734769ce89e73b5" gracePeriod=10 Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.553958 4952 generic.go:334] "Generic (PLEG): container finished" podID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerID="bc789f8d4149663f25d6cca074074343379003da58f29ba1e734769ce89e73b5" exitCode=0 Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.554029 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" event={"ID":"3c23dea9-e99e-4527-902f-dc7280730cd3","Type":"ContainerDied","Data":"bc789f8d4149663f25d6cca074074343379003da58f29ba1e734769ce89e73b5"} Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.559665 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerStarted","Data":"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4"} Nov 22 03:13:25 crc kubenswrapper[4952]: I1122 03:13:25.599427 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.173:5353: connect: connection refused" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.333582 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.461784 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v27q\" (UniqueName: \"kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q\") pod \"3c23dea9-e99e-4527-902f-dc7280730cd3\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.461847 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb\") pod \"3c23dea9-e99e-4527-902f-dc7280730cd3\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.462166 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc\") pod \"3c23dea9-e99e-4527-902f-dc7280730cd3\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.462220 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config\") pod \"3c23dea9-e99e-4527-902f-dc7280730cd3\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.462295 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb\") pod \"3c23dea9-e99e-4527-902f-dc7280730cd3\" (UID: \"3c23dea9-e99e-4527-902f-dc7280730cd3\") " Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.469986 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q" (OuterVolumeSpecName: "kube-api-access-9v27q") pod "3c23dea9-e99e-4527-902f-dc7280730cd3" (UID: "3c23dea9-e99e-4527-902f-dc7280730cd3"). InnerVolumeSpecName "kube-api-access-9v27q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.531265 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c23dea9-e99e-4527-902f-dc7280730cd3" (UID: "3c23dea9-e99e-4527-902f-dc7280730cd3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.567601 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v27q\" (UniqueName: \"kubernetes.io/projected/3c23dea9-e99e-4527-902f-dc7280730cd3-kube-api-access-9v27q\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.567697 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.570005 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c23dea9-e99e-4527-902f-dc7280730cd3" (UID: "3c23dea9-e99e-4527-902f-dc7280730cd3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.570227 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c23dea9-e99e-4527-902f-dc7280730cd3" (UID: "3c23dea9-e99e-4527-902f-dc7280730cd3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.582131 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config" (OuterVolumeSpecName: "config") pod "3c23dea9-e99e-4527-902f-dc7280730cd3" (UID: "3c23dea9-e99e-4527-902f-dc7280730cd3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.583754 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.649869 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-pqrg5" event={"ID":"3c23dea9-e99e-4527-902f-dc7280730cd3","Type":"ContainerDied","Data":"8fceefad12c3f40f24a9d016d7ae82029f1855b6f0fcb2e08894c3ee48ec74fe"} Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.656492 4952 scope.go:117] "RemoveContainer" containerID="bc789f8d4149663f25d6cca074074343379003da58f29ba1e734769ce89e73b5" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.672069 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.672114 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.672125 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c23dea9-e99e-4527-902f-dc7280730cd3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.688361 4952 scope.go:117] "RemoveContainer" containerID="22195f74a71bb4825a9ce31871f9a3ae45278d1ac01a28f2cfb8709d0240cbbc" Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.690852 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:13:26 crc kubenswrapper[4952]: I1122 03:13:26.702816 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-pqrg5"] Nov 22 03:13:27 crc kubenswrapper[4952]: I1122 03:13:27.596981 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerStarted","Data":"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9"} Nov 22 03:13:27 crc kubenswrapper[4952]: I1122 03:13:27.598465 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:13:27 crc kubenswrapper[4952]: I1122 03:13:27.637884 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.510524227 podStartE2EDuration="6.637859505s" podCreationTimestamp="2025-11-22 03:13:21 +0000 UTC" firstStartedPulling="2025-11-22 03:13:22.588620375 +0000 UTC m=+1166.894637638" lastFinishedPulling="2025-11-22 03:13:26.715955643 +0000 UTC m=+1171.021972916" observedRunningTime="2025-11-22 03:13:27.621454087 +0000 UTC m=+1171.927471370" watchObservedRunningTime="2025-11-22 03:13:27.637859505 +0000 UTC m=+1171.943876788" Nov 22 03:13:28 crc kubenswrapper[4952]: I1122 03:13:28.563604 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" path="/var/lib/kubelet/pods/3c23dea9-e99e-4527-902f-dc7280730cd3/volumes" Nov 22 03:13:30 crc kubenswrapper[4952]: I1122 03:13:30.665845 4952 generic.go:334] "Generic (PLEG): container finished" podID="04233ec3-e35d-4cb2-959d-2fad451655d2" containerID="7aa8ed29b36a3bae662ab78715137eaf060b736a0f6a1049dc7146ab55e8667a" exitCode=0 Nov 22 03:13:30 crc kubenswrapper[4952]: I1122 03:13:30.665961 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-l4xwk" event={"ID":"04233ec3-e35d-4cb2-959d-2fad451655d2","Type":"ContainerDied","Data":"7aa8ed29b36a3bae662ab78715137eaf060b736a0f6a1049dc7146ab55e8667a"} Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.034999 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.161018 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.161102 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.218409 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data\") pod \"04233ec3-e35d-4cb2-959d-2fad451655d2\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.218959 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts\") pod \"04233ec3-e35d-4cb2-959d-2fad451655d2\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.219187 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle\") pod \"04233ec3-e35d-4cb2-959d-2fad451655d2\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.219535 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcl8x\" (UniqueName: \"kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x\") pod \"04233ec3-e35d-4cb2-959d-2fad451655d2\" (UID: \"04233ec3-e35d-4cb2-959d-2fad451655d2\") " Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.240946 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x" (OuterVolumeSpecName: "kube-api-access-tcl8x") pod "04233ec3-e35d-4cb2-959d-2fad451655d2" (UID: "04233ec3-e35d-4cb2-959d-2fad451655d2"). InnerVolumeSpecName "kube-api-access-tcl8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.241106 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts" (OuterVolumeSpecName: "scripts") pod "04233ec3-e35d-4cb2-959d-2fad451655d2" (UID: "04233ec3-e35d-4cb2-959d-2fad451655d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.249208 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04233ec3-e35d-4cb2-959d-2fad451655d2" (UID: "04233ec3-e35d-4cb2-959d-2fad451655d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.250612 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data" (OuterVolumeSpecName: "config-data") pod "04233ec3-e35d-4cb2-959d-2fad451655d2" (UID: "04233ec3-e35d-4cb2-959d-2fad451655d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.322481 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.322529 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcl8x\" (UniqueName: \"kubernetes.io/projected/04233ec3-e35d-4cb2-959d-2fad451655d2-kube-api-access-tcl8x\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.322560 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.322573 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04233ec3-e35d-4cb2-959d-2fad451655d2-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.687585 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l4xwk" event={"ID":"04233ec3-e35d-4cb2-959d-2fad451655d2","Type":"ContainerDied","Data":"d19cf967d318e8f475699be77ebd4639ace3144669a5e625ba265eef540797ee"} Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.688095 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19cf967d318e8f475699be77ebd4639ace3144669a5e625ba265eef540797ee" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.687696 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l4xwk" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.979874 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.980205 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-log" containerID="cri-o://07166c25dc216572bd6890b512a4a0240020f5e02f36b7887949de8071d6c9aa" gracePeriod=30 Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.980404 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-api" containerID="cri-o://736bab4512c8d69a580698ea5780ace6a0b015b183636a940eb865391b295209" gracePeriod=30 Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.986410 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": EOF" Nov 22 03:13:32 crc kubenswrapper[4952]: I1122 03:13:32.986647 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": EOF" Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.072104 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.072720 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" containerID="cri-o://b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4" gracePeriod=30 Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.073063 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" containerID="cri-o://f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156" gracePeriod=30 Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.101657 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.101921 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a7378007-7521-4913-bfba-431de1bc6b02" containerName="nova-scheduler-scheduler" containerID="cri-o://ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" gracePeriod=30 Nov 22 03:13:33 crc kubenswrapper[4952]: E1122 03:13:33.495597 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:13:33 crc kubenswrapper[4952]: E1122 03:13:33.519136 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:13:33 crc kubenswrapper[4952]: E1122 03:13:33.534061 4952 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 03:13:33 crc kubenswrapper[4952]: E1122 03:13:33.534133 4952 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a7378007-7521-4913-bfba-431de1bc6b02" containerName="nova-scheduler-scheduler" Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.698793 4952 generic.go:334] "Generic (PLEG): container finished" podID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerID="07166c25dc216572bd6890b512a4a0240020f5e02f36b7887949de8071d6c9aa" exitCode=143 Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.698891 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerDied","Data":"07166c25dc216572bd6890b512a4a0240020f5e02f36b7887949de8071d6c9aa"} Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.700423 4952 generic.go:334] "Generic (PLEG): container finished" podID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerID="f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156" exitCode=143 Nov 22 03:13:33 crc kubenswrapper[4952]: I1122 03:13:33.700461 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerDied","Data":"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156"} Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.240667 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:57332->10.217.0.177:8775: read: connection reset by peer" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.240778 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:57348->10.217.0.177:8775: read: connection reset by peer" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.692914 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.755129 4952 generic.go:334] "Generic (PLEG): container finished" podID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerID="b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4" exitCode=0 Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.755202 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerDied","Data":"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4"} Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.755258 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"911e63ba-b5de-4e5f-80e9-d62822cb8bac","Type":"ContainerDied","Data":"de9563ac5f45702da09c74c63566ec9969ac179e252c06056b889a4fd41f220b"} Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.755264 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.755292 4952 scope.go:117] "RemoveContainer" containerID="b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.795713 4952 scope.go:117] "RemoveContainer" containerID="f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.818034 4952 scope.go:117] "RemoveContainer" containerID="b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4" Nov 22 03:13:36 crc kubenswrapper[4952]: E1122 03:13:36.819026 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4\": container with ID starting with b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4 not found: ID does not exist" containerID="b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.819064 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4"} err="failed to get container status \"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4\": rpc error: code = NotFound desc = could not find container \"b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4\": container with ID starting with b7d7af8fd776c334b8aa5bc0f9d713749d822981b734fd714e856772dd46f4f4 not found: ID does not exist" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.819093 4952 scope.go:117] "RemoveContainer" containerID="f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156" Nov 22 03:13:36 crc kubenswrapper[4952]: E1122 03:13:36.819304 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156\": container with ID starting with f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156 not found: ID does not exist" containerID="f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.819326 4952 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156"} err="failed to get container status \"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156\": rpc error: code = NotFound desc = could not find container \"f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156\": container with ID starting with f50e7157212607d3d7d4da489f11a3aea8a2cb79c9bdd09955d6d86f9ad62156 not found: ID does not exist" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.824683 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data\") pod \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.824782 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle\") pod \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.824822 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcpl\" (UniqueName: \"kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl\") pod \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.824991 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs\") pod \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.825170 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs\") pod \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\" (UID: \"911e63ba-b5de-4e5f-80e9-d62822cb8bac\") " Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.826281 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs" (OuterVolumeSpecName: "logs") pod "911e63ba-b5de-4e5f-80e9-d62822cb8bac" (UID: "911e63ba-b5de-4e5f-80e9-d62822cb8bac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.834953 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl" (OuterVolumeSpecName: "kube-api-access-dqcpl") pod "911e63ba-b5de-4e5f-80e9-d62822cb8bac" (UID: "911e63ba-b5de-4e5f-80e9-d62822cb8bac"). InnerVolumeSpecName "kube-api-access-dqcpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.860785 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "911e63ba-b5de-4e5f-80e9-d62822cb8bac" (UID: "911e63ba-b5de-4e5f-80e9-d62822cb8bac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.868050 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data" (OuterVolumeSpecName: "config-data") pod "911e63ba-b5de-4e5f-80e9-d62822cb8bac" (UID: "911e63ba-b5de-4e5f-80e9-d62822cb8bac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.907628 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "911e63ba-b5de-4e5f-80e9-d62822cb8bac" (UID: "911e63ba-b5de-4e5f-80e9-d62822cb8bac"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.926954 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.926985 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.926999 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcpl\" (UniqueName: \"kubernetes.io/projected/911e63ba-b5de-4e5f-80e9-d62822cb8bac-kube-api-access-dqcpl\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.927021 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911e63ba-b5de-4e5f-80e9-d62822cb8bac-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:36 crc kubenswrapper[4952]: I1122 03:13:36.927030 4952 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/911e63ba-b5de-4e5f-80e9-d62822cb8bac-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.097194 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.112700 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139274 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:37 crc kubenswrapper[4952]: E1122 03:13:37.139716 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139737 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" Nov 22 03:13:37 crc kubenswrapper[4952]: E1122 03:13:37.139750 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04233ec3-e35d-4cb2-959d-2fad451655d2" containerName="nova-manage" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139757 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="04233ec3-e35d-4cb2-959d-2fad451655d2" containerName="nova-manage" Nov 22 03:13:37 
crc kubenswrapper[4952]: E1122 03:13:37.139775 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="dnsmasq-dns" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139782 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="dnsmasq-dns" Nov 22 03:13:37 crc kubenswrapper[4952]: E1122 03:13:37.139794 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139802 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" Nov 22 03:13:37 crc kubenswrapper[4952]: E1122 03:13:37.139819 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="init" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.139827 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="init" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.140006 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c23dea9-e99e-4527-902f-dc7280730cd3" containerName="dnsmasq-dns" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.140023 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-metadata" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.140039 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" containerName="nova-metadata-log" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.140055 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="04233ec3-e35d-4cb2-959d-2fad451655d2" containerName="nova-manage" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.142783 4952 util.go:30] "No sandbox for pod can be found. 
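
The RemoveStaleState burst above is housekeeping on pod ADD: CPU and memory assignments still recorded for UIDs that no longer exist (the old nova-metadata-0, the finished nova-manage job, a dnsmasq pod) are dropped from the resource-manager state, which the kubelet checkpoints on disk. A reader for the CPU manager checkpoint (the file path is standard; the JSON field set varies by kubelet version, so the struct below is an assumption to adapt):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// cpuManagerState mirrors the common shape of
// /var/lib/kubelet/cpu_manager_state; fields are version-dependent.
type cpuManagerState struct {
	PolicyName    string                       `json:"policyName"`
	DefaultCPUSet string                       `json:"defaultCpuSet"`
	Entries       map[string]map[string]string `json:"entries,omitempty"` // podUID -> container -> cpuset
	Checksum      uint64                       `json:"checksum"`
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var st cpuManagerState
	if err := json.Unmarshal(raw, &st); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("policy=%s default=%q pinned pods=%d\n",
		st.PolicyName, st.DefaultCPUSet, len(st.Entries))
}
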
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.146989 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.147357 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.166911 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.233127 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-config-data\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.233206 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.233261 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjwf\" (UniqueName: \"kubernetes.io/projected/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-kube-api-access-fjjwf\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.233470 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-logs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.233767 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.335532 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.335650 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-config-data\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.335686 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " 
pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.335719 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjwf\" (UniqueName: \"kubernetes.io/projected/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-kube-api-access-fjjwf\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.335758 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-logs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.336265 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-logs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.341406 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.341718 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.342266 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-config-data\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.356328 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjwf\" (UniqueName: \"kubernetes.io/projected/7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86-kube-api-access-fjjwf\") pod \"nova-metadata-0\" (UID: \"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86\") " pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.469162 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.772631 4952 generic.go:334] "Generic (PLEG): container finished" podID="a7378007-7521-4913-bfba-431de1bc6b02" containerID="ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" exitCode=0 Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.772730 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7378007-7521-4913-bfba-431de1bc6b02","Type":"ContainerDied","Data":"ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905"} Nov 22 03:13:37 crc kubenswrapper[4952]: I1122 03:13:37.978336 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.061383 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data\") pod \"a7378007-7521-4913-bfba-431de1bc6b02\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.062840 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle\") pod \"a7378007-7521-4913-bfba-431de1bc6b02\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.062898 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmfq\" (UniqueName: \"kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq\") pod \"a7378007-7521-4913-bfba-431de1bc6b02\" (UID: \"a7378007-7521-4913-bfba-431de1bc6b02\") " Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.070045 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq" (OuterVolumeSpecName: "kube-api-access-5nmfq") pod "a7378007-7521-4913-bfba-431de1bc6b02" (UID: "a7378007-7521-4913-bfba-431de1bc6b02"). InnerVolumeSpecName "kube-api-access-5nmfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.099902 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7378007-7521-4913-bfba-431de1bc6b02" (UID: "a7378007-7521-4913-bfba-431de1bc6b02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.104216 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data" (OuterVolumeSpecName: "config-data") pod "a7378007-7521-4913-bfba-431de1bc6b02" (UID: "a7378007-7521-4913-bfba-431de1bc6b02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.153992 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.165295 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.165594 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7378007-7521-4913-bfba-431de1bc6b02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.168725 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nmfq\" (UniqueName: \"kubernetes.io/projected/a7378007-7521-4913-bfba-431de1bc6b02-kube-api-access-5nmfq\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.547181 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911e63ba-b5de-4e5f-80e9-d62822cb8bac" path="/var/lib/kubelet/pods/911e63ba-b5de-4e5f-80e9-d62822cb8bac/volumes" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.789269 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86","Type":"ContainerStarted","Data":"31fe7e1b4c3ef8d8515e4a65d8330f7a70d56951bc254186fa4dce2cafb5b9ec"} Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.789360 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86","Type":"ContainerStarted","Data":"4421edf00f0778ae000ef248259c98f75ee3282147aa0856408f8c24045e3447"} Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.789375 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86","Type":"ContainerStarted","Data":"6cd0cbb049d9c883ea1c05392c730267265d3a85278008f8a5c526a19bbe71a2"} Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.793601 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7378007-7521-4913-bfba-431de1bc6b02","Type":"ContainerDied","Data":"ac5444b657572d63aee63e1f3ca99de3555a8edb0815f9420b9e7a4a86233f75"} Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.793665 4952 scope.go:117] "RemoveContainer" containerID="ca993581dc2188e48b1ef3447f29477af021cff4789c4ef9cb16ebf9eb215905" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.793670 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.822604 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.82257757 podStartE2EDuration="1.82257757s" podCreationTimestamp="2025-11-22 03:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:38.814539677 +0000 UTC m=+1183.120556960" watchObservedRunningTime="2025-11-22 03:13:38.82257757 +0000 UTC m=+1183.128594843" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.848062 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.865071 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.882287 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:38 crc kubenswrapper[4952]: E1122 03:13:38.882948 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7378007-7521-4913-bfba-431de1bc6b02" containerName="nova-scheduler-scheduler" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.882983 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7378007-7521-4913-bfba-431de1bc6b02" containerName="nova-scheduler-scheduler" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.883265 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7378007-7521-4913-bfba-431de1bc6b02" containerName="nova-scheduler-scheduler" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.885882 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.888306 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.895022 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.985780 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhhm\" (UniqueName: \"kubernetes.io/projected/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-kube-api-access-6fhhm\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.985897 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:38 crc kubenswrapper[4952]: I1122 03:13:38.985930 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-config-data\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.089487 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fhhm\" (UniqueName: \"kubernetes.io/projected/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-kube-api-access-6fhhm\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.089668 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.089699 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-config-data\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.097693 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.097800 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-config-data\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.128104 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhhm\" (UniqueName: 
\"kubernetes.io/projected/f34eb03c-9da3-48a6-8b9d-1c507aa538d2-kube-api-access-6fhhm\") pod \"nova-scheduler-0\" (UID: \"f34eb03c-9da3-48a6-8b9d-1c507aa538d2\") " pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.217147 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.713099 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.812884 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f34eb03c-9da3-48a6-8b9d-1c507aa538d2","Type":"ContainerStarted","Data":"5cf47321c107900eaabc4e0a2229b38f51530c1cd60c91bf402adc8a7ec43b47"} Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.816934 4952 generic.go:334] "Generic (PLEG): container finished" podID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerID="736bab4512c8d69a580698ea5780ace6a0b015b183636a940eb865391b295209" exitCode=0 Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.817013 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerDied","Data":"736bab4512c8d69a580698ea5780ace6a0b015b183636a940eb865391b295209"} Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.817051 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629","Type":"ContainerDied","Data":"d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd"} Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.817067 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4fbb1233e5c0b9a6caf9817b03e310aa64cb8e39d8bcdd4bf6ee69ea6c404fd" Nov 22 03:13:39 crc kubenswrapper[4952]: I1122 03:13:39.872117 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.010968 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.011054 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.011093 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.011143 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.011214 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wlf\" (UniqueName: \"kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.011306 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs\") pod \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\" (UID: \"1d01c1aa-13e7-4bab-bfc1-cf99d0c99629\") " Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.012208 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs" (OuterVolumeSpecName: "logs") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.017490 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf" (OuterVolumeSpecName: "kube-api-access-g4wlf") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "kube-api-access-g4wlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.043687 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.044114 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data" (OuterVolumeSpecName: "config-data") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.074526 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.084873 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" (UID: "1d01c1aa-13e7-4bab-bfc1-cf99d0c99629"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113773 4952 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113814 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113826 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113839 4952 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113854 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wlf\" (UniqueName: \"kubernetes.io/projected/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-kube-api-access-g4wlf\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.113869 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.542786 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7378007-7521-4913-bfba-431de1bc6b02" path="/var/lib/kubelet/pods/a7378007-7521-4913-bfba-431de1bc6b02/volumes" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.833814 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.833810 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f34eb03c-9da3-48a6-8b9d-1c507aa538d2","Type":"ContainerStarted","Data":"4137abce006b328645a098390a10220e7cb084e5fd61afea55faa05881a7802e"} Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.860343 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.860321378 podStartE2EDuration="2.860321378s" podCreationTimestamp="2025-11-22 03:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:40.855616282 +0000 UTC m=+1185.161633615" watchObservedRunningTime="2025-11-22 03:13:40.860321378 +0000 UTC m=+1185.166338651" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.892084 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.900089 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.933599 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:40 crc kubenswrapper[4952]: E1122 03:13:40.934086 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-log" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.934107 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-log" Nov 22 03:13:40 crc kubenswrapper[4952]: E1122 03:13:40.934126 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-api" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.934133 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-api" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.934333 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-log" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.934358 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" containerName="nova-api-api" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.937762 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.944942 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.945266 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.945536 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:13:40 crc kubenswrapper[4952]: I1122 03:13:40.952398 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037518 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfxm7\" (UniqueName: \"kubernetes.io/projected/66852029-394b-4d8f-9104-709ac254def2-kube-api-access-dfxm7\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037651 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037690 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-public-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037728 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037781 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66852029-394b-4d8f-9104-709ac254def2-logs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.037892 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-config-data\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140195 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfxm7\" (UniqueName: \"kubernetes.io/projected/66852029-394b-4d8f-9104-709ac254def2-kube-api-access-dfxm7\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140381 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140431 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-public-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140477 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140535 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66852029-394b-4d8f-9104-709ac254def2-logs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.140672 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-config-data\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.142126 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66852029-394b-4d8f-9104-709ac254def2-logs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.146500 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-public-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.147476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.148059 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-config-data\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.157114 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66852029-394b-4d8f-9104-709ac254def2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.165088 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfxm7\" (UniqueName: \"kubernetes.io/projected/66852029-394b-4d8f-9104-709ac254def2-kube-api-access-dfxm7\") pod \"nova-api-0\" (UID: \"66852029-394b-4d8f-9104-709ac254def2\") " pod="openstack/nova-api-0" Nov 
22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.260847 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.792051 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 03:13:41 crc kubenswrapper[4952]: I1122 03:13:41.874408 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66852029-394b-4d8f-9104-709ac254def2","Type":"ContainerStarted","Data":"aa2c19842fb937a851a81a956acbd66a0078f59a6ca4abea653c32afb894f678"}
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.469572 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.470095 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.543966 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d01c1aa-13e7-4bab-bfc1-cf99d0c99629" path="/var/lib/kubelet/pods/1d01c1aa-13e7-4bab-bfc1-cf99d0c99629/volumes"
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.888126 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66852029-394b-4d8f-9104-709ac254def2","Type":"ContainerStarted","Data":"c9e0cc6b6efda4e3bd06edfd8ac94bd71ddccbb628db3c17605e289a97eabe26"}
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.889728 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66852029-394b-4d8f-9104-709ac254def2","Type":"ContainerStarted","Data":"915794640de051a01b07e85ea6c66d9be1fc1b7d7bf760db979f51682351a11f"}
Nov 22 03:13:42 crc kubenswrapper[4952]: I1122 03:13:42.921408 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.921377457 podStartE2EDuration="2.921377457s" podCreationTimestamp="2025-11-22 03:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:42.914123114 +0000 UTC m=+1187.220140387" watchObservedRunningTime="2025-11-22 03:13:42.921377457 +0000 UTC m=+1187.227394730"
Nov 22 03:13:44 crc kubenswrapper[4952]: I1122 03:13:44.217488 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 03:13:47 crc kubenswrapper[4952]: I1122 03:13:47.470321 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:13:47 crc kubenswrapper[4952]: I1122 03:13:47.470958 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:13:48 crc kubenswrapper[4952]: I1122 03:13:48.508873 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:13:48 crc kubenswrapper[4952]: I1122 03:13:48.508925 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:13:49 crc kubenswrapper[4952]: I1122 03:13:49.217409 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 22 03:13:49 crc kubenswrapper[4952]: I1122 03:13:49.251952 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 22 03:13:50 crc kubenswrapper[4952]: I1122 03:13:50.023140 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 22 03:13:51 crc kubenswrapper[4952]: I1122 03:13:51.262887 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:13:51 crc kubenswrapper[4952]: I1122 03:13:51.263736 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:13:52 crc kubenswrapper[4952]: I1122 03:13:52.054099 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 22 03:13:52 crc kubenswrapper[4952]: I1122 03:13:52.297899 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66852029-394b-4d8f-9104-709ac254def2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:13:52 crc kubenswrapper[4952]: I1122 03:13:52.297879 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66852029-394b-4d8f-9104-709ac254def2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:13:57 crc kubenswrapper[4952]: I1122 03:13:57.482171 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:13:57 crc kubenswrapper[4952]: I1122 03:13:57.483315 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:13:57 crc kubenswrapper[4952]: I1122 03:13:57.494404 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:13:57 crc kubenswrapper[4952]: I1122 03:13:57.498449 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:14:01 crc kubenswrapper[4952]: I1122 03:14:01.272448 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:14:01 crc kubenswrapper[4952]: I1122 03:14:01.275490 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:14:01 crc kubenswrapper[4952]: I1122 03:14:01.275868 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:14:01 crc kubenswrapper[4952]: I1122 03:14:01.286649 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:14:02 crc kubenswrapper[4952]: I1122 03:14:02.134207 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:14:02 crc kubenswrapper[4952]: I1122 03:14:02.143967 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:14:11 crc kubenswrapper[4952]: I1122 03:14:11.217620 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:12 crc kubenswrapper[4952]: I1122 03:14:12.168349 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:14:16 crc kubenswrapper[4952]: I1122 03:14:16.051732 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq" containerID="cri-o://a534a8defa6a1441dddd764b128e65631990ebc6fd61c14f9597fc18ea6815df" gracePeriod=604796
Nov 22 03:14:17 crc kubenswrapper[4952]: I1122 03:14:17.051142 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Nov 22 03:14:17 crc kubenswrapper[4952]: I1122 03:14:17.231504 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="rabbitmq" containerID="cri-o://e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c" gracePeriod=604795
Nov 22 03:14:17 crc kubenswrapper[4952]: I1122 03:14:17.406849 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused"
Nov 22 03:14:22 crc kubenswrapper[4952]: I1122 03:14:22.365663 4952 generic.go:334] "Generic (PLEG): container finished" podID="74351431-f23e-45c3-a8a5-08143737551a" containerID="a534a8defa6a1441dddd764b128e65631990ebc6fd61c14f9597fc18ea6815df" exitCode=0
Nov 22 03:14:22 crc kubenswrapper[4952]: I1122 03:14:22.365714 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerDied","Data":"a534a8defa6a1441dddd764b128e65631990ebc6fd61c14f9597fc18ea6815df"}
Nov 22 03:14:22 crc kubenswrapper[4952]: I1122 03:14:22.858795 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.054538 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.054906 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055080 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055191 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055316 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055420 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055478 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055627 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055729 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.055848 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.056074 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.056196 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cgj6\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.056298 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf\") pod \"74351431-f23e-45c3-a8a5-08143737551a\" (UID: \"74351431-f23e-45c3-a8a5-08143737551a\") "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.056632 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.056731 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.057220 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.067779 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info" (OuterVolumeSpecName: "pod-info") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.069188 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.070760 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.074694 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6" (OuterVolumeSpecName: "kube-api-access-8cgj6") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "kube-api-access-8cgj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.077148 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.093286 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data" (OuterVolumeSpecName: "config-data") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.152568 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf" (OuterVolumeSpecName: "server-conf") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162715 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162764 4952 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-server-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162780 4952 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74351431-f23e-45c3-a8a5-08143737551a-pod-info\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162818 4952 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162832 4952 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74351431-f23e-45c3-a8a5-08143737551a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162845 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cgj6\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-kube-api-access-8cgj6\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162863 4952 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.162874 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74351431-f23e-45c3-a8a5-08143737551a-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.188323 4952 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.206881 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "74351431-f23e-45c3-a8a5-08143737551a" (UID: "74351431-f23e-45c3-a8a5-08143737551a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.264733 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74351431-f23e-45c3-a8a5-08143737551a-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.264789 4952 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.383285 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74351431-f23e-45c3-a8a5-08143737551a","Type":"ContainerDied","Data":"15e079ab10ccec60b459174d579111f87cabba60632fa99feb7199d34f507d4a"}
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.383352 4952 scope.go:117] "RemoveContainer" containerID="a534a8defa6a1441dddd764b128e65631990ebc6fd61c14f9597fc18ea6815df"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.383378 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.437790 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.440890 4952 scope.go:117] "RemoveContainer" containerID="16647ee3b5f63944a1cff2cb0ee6ea097461a6599cbd4eb83545d89f630e239c"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.449240 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.480255 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:23 crc kubenswrapper[4952]: E1122 03:14:23.480860 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.480885 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq"
Nov 22 03:14:23 crc kubenswrapper[4952]: E1122 03:14:23.480908 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="setup-container"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.480917 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="setup-container"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.481169 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="74351431-f23e-45c3-a8a5-08143737551a" containerName="rabbitmq"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.482754 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.489468 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.489699 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.489852 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pnpt7"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.495261 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.495723 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.495777 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.495965 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.516368 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.689778 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690307 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690330 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f71324dd-d6ac-457c-83be-541d1afa5ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690423 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690468 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690490 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690522 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690569 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690590 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690622 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8l7k\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-kube-api-access-s8l7k\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.690711 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f71324dd-d6ac-457c-83be-541d1afa5ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793068 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793163 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793210 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793237 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793285 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793303 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793330 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8l7k\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-kube-api-access-s8l7k\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793398 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f71324dd-d6ac-457c-83be-541d1afa5ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793439 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793459 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.793474 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f71324dd-d6ac-457c-83be-541d1afa5ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.795467 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.795613 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.795957 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.796249 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f71324dd-d6ac-457c-83be-541d1afa5ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.796494 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.797714 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.799707 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f71324dd-d6ac-457c-83be-541d1afa5ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.799871 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f71324dd-d6ac-457c-83be-541d1afa5ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.800501 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.802245 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.817722 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8l7k\" (UniqueName: \"kubernetes.io/projected/f71324dd-d6ac-457c-83be-541d1afa5ec4-kube-api-access-s8l7k\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.828595 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f71324dd-d6ac-457c-83be-541d1afa5ec4\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:14:23 crc kubenswrapper[4952]: I1122 03:14:23.936064 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.034064 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114631 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114711 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjvdp\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114737 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114787 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114869 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.114939 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.115008 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.115050 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.115076 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.115105 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.115134 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf\") pod \"16617513-df98-4123-b612-9bc83023f977\" (UID: \"16617513-df98-4123-b612-9bc83023f977\") "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.117368 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.117798 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.127987 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.131275 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info" (OuterVolumeSpecName: "pod-info") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.132605 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.138307 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp" (OuterVolumeSpecName: "kube-api-access-tjvdp") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "kube-api-access-tjvdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.142559 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.147773 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.190586 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data" (OuterVolumeSpecName: "config-data") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218045 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218076 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjvdp\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-kube-api-access-tjvdp\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218087 4952 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16617513-df98-4123-b612-9bc83023f977-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218101 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218144 4952 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218154 4952 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16617513-df98-4123-b612-9bc83023f977-pod-info\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218163 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218172 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16617513-df98-4123-b612-9bc83023f977-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.218199 4952 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.234768 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf" (OuterVolumeSpecName: "server-conf") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.253967 4952 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.321580 4952 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.321611 4952 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16617513-df98-4123-b612-9bc83023f977-server-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.323706 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "16617513-df98-4123-b612-9bc83023f977" (UID: "16617513-df98-4123-b612-9bc83023f977"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.396848 4952 generic.go:334] "Generic (PLEG): container finished" podID="16617513-df98-4123-b612-9bc83023f977" containerID="e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c" exitCode=0
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.397089 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerDied","Data":"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"}
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.397442 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"16617513-df98-4123-b612-9bc83023f977","Type":"ContainerDied","Data":"328b6d07741ff453284284ebe749afe3a53affc8f28f722d102d5a288ad333ea"}
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.397467 4952 scope.go:117] "RemoveContainer" containerID="e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.397189 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.423577 4952 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16617513-df98-4123-b612-9bc83023f977-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.433211 4952 scope.go:117] "RemoveContainer" containerID="1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.437249 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.453131 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.476590 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:14:24 crc kubenswrapper[4952]: E1122 03:14:24.477097 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="rabbitmq"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.477119 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="rabbitmq"
Nov 22 03:14:24 crc kubenswrapper[4952]: E1122 03:14:24.477153 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="setup-container"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.477161 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="setup-container"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.477330 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="16617513-df98-4123-b612-9bc83023f977" containerName="rabbitmq"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.478401 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.482407 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.483537 4952 scope.go:117] "RemoveContainer" containerID="e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.483755 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.483959 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.484088 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-79mtl"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.484118 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.484524 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Nov 22 03:14:24 crc kubenswrapper[4952]: E1122 03:14:24.484715 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c\": container with ID starting with e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c not found: ID does not exist" containerID="e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.484757 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c"} err="failed to get container status \"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c\": rpc error: code = NotFound desc = could not find container \"e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c\": container with ID starting with e1e8d2e02112a1c0b13b8d6c3218897c8a96f904b0b23a4220d6ee33677d651c not found: ID does not exist"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.484790 4952 scope.go:117] "RemoveContainer" containerID="1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.485140 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 22 03:14:24 crc kubenswrapper[4952]: E1122 03:14:24.486254 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff\": container with ID starting with 1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff not found: ID does not exist" containerID="1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.486291 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff"} err="failed to get container status \"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff\": rpc error: code = NotFound desc = could not find container \"1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff\": container with ID starting with 1149aa85b3a1ea05d13aa42a8717bca85adc7bd1bb7d63db8328cd3d7f41c1ff not found: ID does not exist"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.511679 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.553036 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16617513-df98-4123-b612-9bc83023f977" path="/var/lib/kubelet/pods/16617513-df98-4123-b612-9bc83023f977/volumes"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.553994 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74351431-f23e-45c3-a8a5-08143737551a" path="/var/lib/kubelet/pods/74351431-f23e-45c3-a8a5-08143737551a/volumes"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.554811 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627311 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627366 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627444 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627476 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627575 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627640 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbkf\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-kube-api-access-ntbkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627771 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627886 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627968 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.627991 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.628023 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730147 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730205 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730233 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbkf\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-kube-api-access-ntbkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730275 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730319 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730351 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730409 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730444 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730467 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730509 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.730822 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.731024 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.732030 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.732342 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.732570 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.733095 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.735813 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.735859 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.737236 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.737835 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.752145 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbkf\" (UniqueName: \"kubernetes.io/projected/8f72c2a8-8441-4469-a7aa-d87b27a7dd6a-kube-api-access-ntbkf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.757084 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:14:24 crc kubenswrapper[4952]: I1122 03:14:24.828783 4952 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:14:25 crc kubenswrapper[4952]: I1122 03:14:25.337974 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:14:25 crc kubenswrapper[4952]: I1122 03:14:25.426480 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f71324dd-d6ac-457c-83be-541d1afa5ec4","Type":"ContainerStarted","Data":"8c40b9e6052e4cfc06d43f94e34a1c87e64e4d9ea54fddd315222cf425bbba9d"} Nov 22 03:14:25 crc kubenswrapper[4952]: I1122 03:14:25.433363 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a","Type":"ContainerStarted","Data":"703ce640b5c3a9eebdee62e190fe1bab14a5cc543420562eb8302aea00c9ae79"} Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.437974 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.441139 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.447880 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.448526 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.461123 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a","Type":"ContainerStarted","Data":"f3adc2c91ad0274447f6a6f1329f90f4673e629ec6233bc2fffcd4a78c76d8f0"} Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.462974 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f71324dd-d6ac-457c-83be-541d1afa5ec4","Type":"ContainerStarted","Data":"04dad3fe46823507d65c5bbcba518353cfe64bdcbc3e3eabfe056b3e76686753"} Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.621165 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpljc\" (UniqueName: \"kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.621812 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.623108 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.623286 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.624471 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.624706 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727485 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727597 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727698 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpljc\" (UniqueName: \"kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727726 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.727773 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.728966 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.729304 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.729304 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.729304 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.729773 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.757919 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpljc\" (UniqueName: \"kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc\") pod \"dnsmasq-dns-6447ccbd8f-nj89b\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:27 crc kubenswrapper[4952]: I1122 03:14:27.762331 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:28 crc kubenswrapper[4952]: I1122 03:14:28.304868 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:28 crc kubenswrapper[4952]: W1122 03:14:28.310883 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c2756c_4a3d_4ea3_9b2f_76774ea195a9.slice/crio-59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7 WatchSource:0}: Error finding container 59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7: Status 404 returned error can't find the container with id 59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7 Nov 22 03:14:28 crc kubenswrapper[4952]: I1122 03:14:28.473786 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" event={"ID":"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9","Type":"ContainerStarted","Data":"59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7"} Nov 22 03:14:29 crc kubenswrapper[4952]: I1122 03:14:29.487852 4952 generic.go:334] "Generic (PLEG): container finished" podID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerID="7d85312346b2305a83c3a728c3df8638e883d70455e27d38da3360f76d803d0d" exitCode=0 Nov 22 03:14:29 crc kubenswrapper[4952]: I1122 03:14:29.487905 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" event={"ID":"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9","Type":"ContainerDied","Data":"7d85312346b2305a83c3a728c3df8638e883d70455e27d38da3360f76d803d0d"} Nov 22 03:14:30 crc kubenswrapper[4952]: I1122 03:14:30.507649 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" event={"ID":"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9","Type":"ContainerStarted","Data":"0c47826f64a2845bfc4f9be9185c5c3ea688f1ef05f6146a05f9ad4b4f81a521"} Nov 22 03:14:30 crc kubenswrapper[4952]: I1122 03:14:30.508252 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:30 crc kubenswrapper[4952]: I1122 03:14:30.548673 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" podStartSLOduration=3.548640669 podStartE2EDuration="3.548640669s" podCreationTimestamp="2025-11-22 03:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:14:30.548199877 +0000 UTC m=+1234.854217180" watchObservedRunningTime="2025-11-22 03:14:30.548640669 +0000 UTC m=+1234.854658012" Nov 22 03:14:37 crc kubenswrapper[4952]: I1122 03:14:37.763873 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:37 crc kubenswrapper[4952]: I1122 03:14:37.847309 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:14:37 crc kubenswrapper[4952]: I1122 03:14:37.849223 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="dnsmasq-dns" containerID="cri-o://5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b" gracePeriod=10 Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.039693 4952 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.041811 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.067789 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.199057 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.199727 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.200044 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.200105 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xdc\" (UniqueName: \"kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.200134 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.200159 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302161 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302217 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xdc\" (UniqueName: \"kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: 
\"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302285 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302320 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302419 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.302444 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.303948 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.304306 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.304440 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.304566 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.306684 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc 
kubenswrapper[4952]: I1122 03:14:38.333744 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xdc\" (UniqueName: \"kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc\") pod \"dnsmasq-dns-864d5fc68c-sxwx6\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.393512 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.547854 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.610487 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc\") pod \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.610591 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb\") pod \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.610689 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config\") pod \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.610733 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2znr\" (UniqueName: \"kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr\") pod \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.610830 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb\") pod \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\" (UID: \"ce5b6b87-7965-4f3a-a929-c5495ef9176d\") " Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.618835 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr" (OuterVolumeSpecName: "kube-api-access-b2znr") pod "ce5b6b87-7965-4f3a-a929-c5495ef9176d" (UID: "ce5b6b87-7965-4f3a-a929-c5495ef9176d"). InnerVolumeSpecName "kube-api-access-b2znr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.624799 4952 generic.go:334] "Generic (PLEG): container finished" podID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerID="5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b" exitCode=0 Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.624844 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" event={"ID":"ce5b6b87-7965-4f3a-a929-c5495ef9176d","Type":"ContainerDied","Data":"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b"} Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.624873 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" event={"ID":"ce5b6b87-7965-4f3a-a929-c5495ef9176d","Type":"ContainerDied","Data":"41c91b7bf029e2f46f66e0b5ee84c862da7d71f1d96337b65ec15a24f8844ad0"} Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.624893 4952 scope.go:117] "RemoveContainer" containerID="5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.625036 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-pz9jv" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.666238 4952 scope.go:117] "RemoveContainer" containerID="69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.676481 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce5b6b87-7965-4f3a-a929-c5495ef9176d" (UID: "ce5b6b87-7965-4f3a-a929-c5495ef9176d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.677628 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config" (OuterVolumeSpecName: "config") pod "ce5b6b87-7965-4f3a-a929-c5495ef9176d" (UID: "ce5b6b87-7965-4f3a-a929-c5495ef9176d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.679242 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce5b6b87-7965-4f3a-a929-c5495ef9176d" (UID: "ce5b6b87-7965-4f3a-a929-c5495ef9176d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.688643 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce5b6b87-7965-4f3a-a929-c5495ef9176d" (UID: "ce5b6b87-7965-4f3a-a929-c5495ef9176d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.714746 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.715112 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2znr\" (UniqueName: \"kubernetes.io/projected/ce5b6b87-7965-4f3a-a929-c5495ef9176d-kube-api-access-b2znr\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.715127 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.715137 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.715146 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5b6b87-7965-4f3a-a929-c5495ef9176d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.715361 4952 scope.go:117] "RemoveContainer" containerID="5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b" Nov 22 03:14:38 crc kubenswrapper[4952]: E1122 03:14:38.718180 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b\": container with ID starting with 5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b not found: ID does not exist" containerID="5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.718225 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b"} err="failed to get container status \"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b\": rpc error: code = NotFound desc = could not find container \"5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b\": container with ID starting with 5926563b4d2036cb1cf10742c5a9c8782827f4f46fc0fc07e30654425cae110b not found: ID does not exist" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.718256 4952 scope.go:117] "RemoveContainer" containerID="69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8" Nov 22 03:14:38 crc kubenswrapper[4952]: E1122 03:14:38.719248 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8\": container with ID starting with 69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8 not found: ID does not exist" containerID="69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.719335 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8"} err="failed to get container status 
\"69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8\": rpc error: code = NotFound desc = could not find container \"69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8\": container with ID starting with 69d90bbc02f8f196aaf373fa7bb2eeb1ee2267869e35b54da8c20595872feae8 not found: ID does not exist" Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.882694 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:14:38 crc kubenswrapper[4952]: W1122 03:14:38.887077 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf162dc08_bc15_4ec6_b9e7_857bcdfa0dff.slice/crio-a04f591f581b8a75cc570586b876ea0865c4a368fbd63491cb7fdcfa4e46408d WatchSource:0}: Error finding container a04f591f581b8a75cc570586b876ea0865c4a368fbd63491cb7fdcfa4e46408d: Status 404 returned error can't find the container with id a04f591f581b8a75cc570586b876ea0865c4a368fbd63491cb7fdcfa4e46408d Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.972842 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:14:38 crc kubenswrapper[4952]: I1122 03:14:38.983459 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-pz9jv"] Nov 22 03:14:39 crc kubenswrapper[4952]: I1122 03:14:39.641314 4952 generic.go:334] "Generic (PLEG): container finished" podID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerID="f6fec01a09c157266c62d697df5140359a4d04bfac527f5200614f09f7ca37f2" exitCode=0 Nov 22 03:14:39 crc kubenswrapper[4952]: I1122 03:14:39.641416 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" event={"ID":"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff","Type":"ContainerDied","Data":"f6fec01a09c157266c62d697df5140359a4d04bfac527f5200614f09f7ca37f2"} Nov 22 03:14:39 crc kubenswrapper[4952]: I1122 03:14:39.642351 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" event={"ID":"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff","Type":"ContainerStarted","Data":"a04f591f581b8a75cc570586b876ea0865c4a368fbd63491cb7fdcfa4e46408d"} Nov 22 03:14:40 crc kubenswrapper[4952]: I1122 03:14:40.554748 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" path="/var/lib/kubelet/pods/ce5b6b87-7965-4f3a-a929-c5495ef9176d/volumes" Nov 22 03:14:40 crc kubenswrapper[4952]: I1122 03:14:40.668145 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" event={"ID":"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff","Type":"ContainerStarted","Data":"845c1f6384512c20d715342d78f2350b48ffadc195ad0e2f696d985321b7a12e"} Nov 22 03:14:40 crc kubenswrapper[4952]: I1122 03:14:40.668366 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:40 crc kubenswrapper[4952]: I1122 03:14:40.716778 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" podStartSLOduration=2.716743374 podStartE2EDuration="2.716743374s" podCreationTimestamp="2025-11-22 03:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:14:40.703376549 +0000 UTC m=+1245.009393892" watchObservedRunningTime="2025-11-22 03:14:40.716743374 +0000 UTC m=+1245.022760697" 
Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.395753 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.499755 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.500073 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="dnsmasq-dns" containerID="cri-o://0c47826f64a2845bfc4f9be9185c5c3ea688f1ef05f6146a05f9ad4b4f81a521" gracePeriod=10 Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.927674 4952 generic.go:334] "Generic (PLEG): container finished" podID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerID="0c47826f64a2845bfc4f9be9185c5c3ea688f1ef05f6146a05f9ad4b4f81a521" exitCode=0 Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.928105 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" event={"ID":"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9","Type":"ContainerDied","Data":"0c47826f64a2845bfc4f9be9185c5c3ea688f1ef05f6146a05f9ad4b4f81a521"} Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.928153 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" event={"ID":"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9","Type":"ContainerDied","Data":"59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7"} Nov 22 03:14:48 crc kubenswrapper[4952]: I1122 03:14:48.928166 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59d362ff3e4161a671451dd23783e0ee7929663a384ebd60d4afad29bcf821c7" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.006699 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070126 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070200 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070237 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070319 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070514 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpljc\" (UniqueName: \"kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.070593 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb\") pod \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\" (UID: \"a3c2756c-4a3d-4ea3-9b2f-76774ea195a9\") " Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.086574 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc" (OuterVolumeSpecName: "kube-api-access-qpljc") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "kube-api-access-qpljc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.118401 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config" (OuterVolumeSpecName: "config") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.124563 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.136916 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.146401 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.152756 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" (UID: "a3c2756c-4a3d-4ea3-9b2f-76774ea195a9"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173457 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173504 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173521 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173557 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173571 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpljc\" (UniqueName: \"kubernetes.io/projected/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-kube-api-access-qpljc\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.173583 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.939453 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-nj89b" Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.987772 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:49 crc kubenswrapper[4952]: I1122 03:14:49.997493 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-nj89b"] Nov 22 03:14:50 crc kubenswrapper[4952]: I1122 03:14:50.543795 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" path="/var/lib/kubelet/pods/a3c2756c-4a3d-4ea3-9b2f-76774ea195a9/volumes" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.341422 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.342819 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.821840 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"] Nov 22 03:14:58 crc kubenswrapper[4952]: E1122 03:14:58.823064 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.823167 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: E1122 03:14:58.823267 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="init" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.823344 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="init" Nov 22 03:14:58 crc kubenswrapper[4952]: E1122 03:14:58.823427 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="init" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.823509 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="init" Nov 22 03:14:58 crc kubenswrapper[4952]: E1122 03:14:58.823631 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.823709 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.824029 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5b6b87-7965-4f3a-a929-c5495ef9176d" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.824126 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c2756c-4a3d-4ea3-9b2f-76774ea195a9" containerName="dnsmasq-dns" Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 
03:14:58.825306 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.828317 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.828787 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.829090 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.835708 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.851642 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"]
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.928142 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.931401 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.931791 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:58 crc kubenswrapper[4952]: I1122 03:14:58.931974 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqxcx\" (UniqueName: \"kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.035947 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.036064 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.036120 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.036159 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqxcx\" (UniqueName: \"kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.047478 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.049883 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.061501 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.063920 4952 generic.go:334] "Generic (PLEG): container finished" podID="f71324dd-d6ac-457c-83be-541d1afa5ec4" containerID="04dad3fe46823507d65c5bbcba518353cfe64bdcbc3e3eabfe056b3e76686753" exitCode=0
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.063960 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f71324dd-d6ac-457c-83be-541d1afa5ec4","Type":"ContainerDied","Data":"04dad3fe46823507d65c5bbcba518353cfe64bdcbc3e3eabfe056b3e76686753"}
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.070272 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqxcx\" (UniqueName: \"kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.154872 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.793374 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"]
Nov 22 03:14:59 crc kubenswrapper[4952]: W1122 03:14:59.805393 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5a968b_0fa9_4656_bd4f_8d84a160ce8e.slice/crio-cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2 WatchSource:0}: Error finding container cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2: Status 404 returned error can't find the container with id cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2
Nov 22 03:14:59 crc kubenswrapper[4952]: I1122 03:14:59.807324 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.079091 4952 generic.go:334] "Generic (PLEG): container finished" podID="8f72c2a8-8441-4469-a7aa-d87b27a7dd6a" containerID="f3adc2c91ad0274447f6a6f1329f90f4673e629ec6233bc2fffcd4a78c76d8f0" exitCode=0
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.079180 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a","Type":"ContainerDied","Data":"f3adc2c91ad0274447f6a6f1329f90f4673e629ec6233bc2fffcd4a78c76d8f0"}
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.080924 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76" event={"ID":"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e","Type":"ContainerStarted","Data":"cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2"}
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.083788 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f71324dd-d6ac-457c-83be-541d1afa5ec4","Type":"ContainerStarted","Data":"ef6680c20269012600957550123810e4c65b6b7eff39559dce6414718e5a74fe"}
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.084066 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.153921 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.153901268 podStartE2EDuration="37.153901268s" podCreationTimestamp="2025-11-22 03:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:15:00.14490559 +0000 UTC m=+1264.450922873" watchObservedRunningTime="2025-11-22 03:15:00.153901268 +0000 UTC m=+1264.459918541"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.204722 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"]
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.206627 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.210783 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.211043 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.216451 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"]
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.279231 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.279319 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.279388 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n876z\" (UniqueName: \"kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.381104 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.381157 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n876z\" (UniqueName: \"kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.381302 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.382416 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.398009 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.407488 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n876z\" (UniqueName: \"kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z\") pod \"collect-profiles-29396355-9qxl6\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.592466 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:00 crc kubenswrapper[4952]: I1122 03:15:00.891910 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"]
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.103202 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6" event={"ID":"2ab7d465-e283-4f05-aa69-8ba55c10a609","Type":"ContainerStarted","Data":"ee48ed3c49323abb80c272cfa1c21945362a2efc97c721da7de93a59886f4234"}
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.103246 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6" event={"ID":"2ab7d465-e283-4f05-aa69-8ba55c10a609","Type":"ContainerStarted","Data":"4f57b411897fbe115ca6095f505a9cb69dbe859d085778c69c6944d5098f2337"}
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.108792 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8f72c2a8-8441-4469-a7aa-d87b27a7dd6a","Type":"ContainerStarted","Data":"2039d50e2026d8fc309c44ab1b75d874341b9ee08902b5c994ada96f7b20a390"}
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.109524 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.122345 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6" podStartSLOduration=1.122325175 podStartE2EDuration="1.122325175s" podCreationTimestamp="2025-11-22 03:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:15:01.120696972 +0000 UTC m=+1265.426714245" watchObservedRunningTime="2025-11-22 03:15:01.122325175 +0000 UTC m=+1265.428342458"
Nov 22 03:15:01 crc kubenswrapper[4952]: I1122 03:15:01.156525 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.156503071 podStartE2EDuration="37.156503071s" podCreationTimestamp="2025-11-22 03:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:15:01.147774189 +0000 UTC m=+1265.453791462" watchObservedRunningTime="2025-11-22 03:15:01.156503071 +0000 UTC m=+1265.462520344"
Nov 22 03:15:02 crc kubenswrapper[4952]: I1122 03:15:02.121196 4952 generic.go:334] "Generic (PLEG): container finished" podID="2ab7d465-e283-4f05-aa69-8ba55c10a609" containerID="ee48ed3c49323abb80c272cfa1c21945362a2efc97c721da7de93a59886f4234" exitCode=0
Nov 22 03:15:02 crc kubenswrapper[4952]: I1122 03:15:02.121279 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6" event={"ID":"2ab7d465-e283-4f05-aa69-8ba55c10a609","Type":"ContainerDied","Data":"ee48ed3c49323abb80c272cfa1c21945362a2efc97c721da7de93a59886f4234"}
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.837702 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.866017 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume\") pod \"2ab7d465-e283-4f05-aa69-8ba55c10a609\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") "
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.866181 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume\") pod \"2ab7d465-e283-4f05-aa69-8ba55c10a609\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") "
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.866320 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n876z\" (UniqueName: \"kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z\") pod \"2ab7d465-e283-4f05-aa69-8ba55c10a609\" (UID: \"2ab7d465-e283-4f05-aa69-8ba55c10a609\") "
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.867122 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ab7d465-e283-4f05-aa69-8ba55c10a609" (UID: "2ab7d465-e283-4f05-aa69-8ba55c10a609"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.869498 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab7d465-e283-4f05-aa69-8ba55c10a609-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.875196 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ab7d465-e283-4f05-aa69-8ba55c10a609" (UID: "2ab7d465-e283-4f05-aa69-8ba55c10a609"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.876120 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z" (OuterVolumeSpecName: "kube-api-access-n876z") pod "2ab7d465-e283-4f05-aa69-8ba55c10a609" (UID: "2ab7d465-e283-4f05-aa69-8ba55c10a609"). InnerVolumeSpecName "kube-api-access-n876z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.974007 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n876z\" (UniqueName: \"kubernetes.io/projected/2ab7d465-e283-4f05-aa69-8ba55c10a609-kube-api-access-n876z\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:03 crc kubenswrapper[4952]: I1122 03:15:03.974046 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab7d465-e283-4f05-aa69-8ba55c10a609-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:04 crc kubenswrapper[4952]: I1122 03:15:04.145567 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6" event={"ID":"2ab7d465-e283-4f05-aa69-8ba55c10a609","Type":"ContainerDied","Data":"4f57b411897fbe115ca6095f505a9cb69dbe859d085778c69c6944d5098f2337"}
Nov 22 03:15:04 crc kubenswrapper[4952]: I1122 03:15:04.145614 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f57b411897fbe115ca6095f505a9cb69dbe859d085778c69c6944d5098f2337"
Nov 22 03:15:04 crc kubenswrapper[4952]: I1122 03:15:04.145674 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"
Nov 22 03:15:11 crc kubenswrapper[4952]: I1122 03:15:11.236151 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76" event={"ID":"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e","Type":"ContainerStarted","Data":"3f53ac2e02db381cd7313d67028bb1a1c324b6969d874dd1b637249d76deae76"}
Nov 22 03:15:11 crc kubenswrapper[4952]: I1122 03:15:11.278942 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76" podStartSLOduration=2.568506445 podStartE2EDuration="13.278916244s" podCreationTimestamp="2025-11-22 03:14:58 +0000 UTC" firstStartedPulling="2025-11-22 03:14:59.807045968 +0000 UTC m=+1264.113063251" lastFinishedPulling="2025-11-22 03:15:10.517455737 +0000 UTC m=+1274.823473050" observedRunningTime="2025-11-22 03:15:11.273642574 +0000 UTC m=+1275.579659907" watchObservedRunningTime="2025-11-22 03:15:11.278916244 +0000 UTC m=+1275.584933557"
Nov 22 03:15:13 crc kubenswrapper[4952]: I1122 03:15:13.939848 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 22 03:15:14 crc kubenswrapper[4952]: I1122 03:15:14.835125 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:15:23 crc kubenswrapper[4952]: I1122 03:15:23.374970 4952 generic.go:334] "Generic (PLEG): container finished" podID="1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" containerID="3f53ac2e02db381cd7313d67028bb1a1c324b6969d874dd1b637249d76deae76" exitCode=0
Nov 22 03:15:23 crc kubenswrapper[4952]: I1122 03:15:23.375652 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76" event={"ID":"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e","Type":"ContainerDied","Data":"3f53ac2e02db381cd7313d67028bb1a1c324b6969d874dd1b637249d76deae76"}
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.905159 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.959423 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle\") pod \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") "
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.959481 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory\") pod \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") "
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.959638 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key\") pod \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") "
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.959734 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqxcx\" (UniqueName: \"kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx\") pod \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\" (UID: \"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e\") "
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.969230 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx" (OuterVolumeSpecName: "kube-api-access-dqxcx") pod "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" (UID: "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e"). InnerVolumeSpecName "kube-api-access-dqxcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:15:24 crc kubenswrapper[4952]: I1122 03:15:24.970059 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" (UID: "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.001136 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" (UID: "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.002728 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory" (OuterVolumeSpecName: "inventory") pod "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" (UID: "1a5a968b-0fa9-4656-bd4f-8d84a160ce8e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.061504 4952 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.061561 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.061574 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.061583 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqxcx\" (UniqueName: \"kubernetes.io/projected/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e-kube-api-access-dqxcx\") on node \"crc\" DevicePath \"\""
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.416302 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76" event={"ID":"1a5a968b-0fa9-4656-bd4f-8d84a160ce8e","Type":"ContainerDied","Data":"cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2"}
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.416401 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1fc55e726512870045cb4157cac2797e9cdffc62f1bf2e2894f93aa4f6d0b2"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.416691 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.518023 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"]
Nov 22 03:15:25 crc kubenswrapper[4952]: E1122 03:15:25.518742 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.518772 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:15:25 crc kubenswrapper[4952]: E1122 03:15:25.518821 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab7d465-e283-4f05-aa69-8ba55c10a609" containerName="collect-profiles"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.518838 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab7d465-e283-4f05-aa69-8ba55c10a609" containerName="collect-profiles"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.519190 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab7d465-e283-4f05-aa69-8ba55c10a609" containerName="collect-profiles"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.519231 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.520355 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.525326 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.525683 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.526033 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.526484 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.527724 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"]
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.572974 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.573228 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkthp\" (UniqueName: \"kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.573498 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.573788 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.675993 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.676196 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.676319 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkthp\" (UniqueName: \"kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.676423 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.683452 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.683519 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.684011 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.708811 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkthp\" (UniqueName: \"kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:25 crc kubenswrapper[4952]: I1122 03:15:25.845411 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"
Nov 22 03:15:26 crc kubenswrapper[4952]: I1122 03:15:26.509351 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"]
Nov 22 03:15:27 crc kubenswrapper[4952]: I1122 03:15:27.444635 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" event={"ID":"562bf029-85bf-47a6-b4a7-913eb130f85b","Type":"ContainerStarted","Data":"fce6a3395361c8740b39d4467840427834d826c7b1fb32c5f5321af038786ec7"}
Nov 22 03:15:27 crc kubenswrapper[4952]: I1122 03:15:27.445279 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" event={"ID":"562bf029-85bf-47a6-b4a7-913eb130f85b","Type":"ContainerStarted","Data":"6a76c613ae6b124544c444676da0c827615033df3d0adc69d094b823c84380f2"}
Nov 22 03:15:27 crc kubenswrapper[4952]: I1122 03:15:27.465737 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" podStartSLOduration=1.935054292 podStartE2EDuration="2.465715867s" podCreationTimestamp="2025-11-22 03:15:25 +0000 UTC" firstStartedPulling="2025-11-22 03:15:26.516844959 +0000 UTC m=+1290.822862242" lastFinishedPulling="2025-11-22 03:15:27.047506504 +0000 UTC m=+1291.353523817" observedRunningTime="2025-11-22 03:15:27.461296299 +0000 UTC m=+1291.767313612" watchObservedRunningTime="2025-11-22 03:15:27.465715867 +0000 UTC m=+1291.771733140"
Nov 22 03:15:28 crc kubenswrapper[4952]: I1122 03:15:28.342427 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:15:28 crc kubenswrapper[4952]: I1122 03:15:28.344471 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.300398 4952 scope.go:117] "RemoveContainer" containerID="80f01f51b674de019efd2b62df4203846c258323d85fe6d06ca876d3111deee3"
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.341622 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.341815 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.341940 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.343060 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.343215 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5" gracePeriod=600
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.815208 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5" exitCode=0
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.815301 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5"}
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.815673 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"}
Nov 22 03:15:58 crc kubenswrapper[4952]: I1122 03:15:58.815699 4952 scope.go:117] "RemoveContainer" containerID="f9d8e3cfbebc6d3bc61b04b622504062503fa5b2938cf86cbe1187a9e089f5b5"
Nov 22 03:16:58 crc kubenswrapper[4952]: I1122 03:16:58.399945 4952 scope.go:117] "RemoveContainer" containerID="c5e0d1a64bde42e0b6c42e227a939d9e166c659c5146d1fcf9c1c30594d9aaeb"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.628861 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.633732 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.647309 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.778291 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.778347 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94m9k\" (UniqueName: \"kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.780169 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.883187 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.883474 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.883510 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94m9k\" (UniqueName: \"kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.884101 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.884168 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.906660 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94m9k\" (UniqueName: \"kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k\") pod \"community-operators-qrvx5\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") " pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:14 crc kubenswrapper[4952]: I1122 03:17:14.962937 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:15 crc kubenswrapper[4952]: I1122 03:17:15.579380 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:15 crc kubenswrapper[4952]: I1122 03:17:15.810845 4952 generic.go:334] "Generic (PLEG): container finished" podID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerID="38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526" exitCode=0
Nov 22 03:17:15 crc kubenswrapper[4952]: I1122 03:17:15.810904 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerDied","Data":"38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526"}
Nov 22 03:17:15 crc kubenswrapper[4952]: I1122 03:17:15.810940 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerStarted","Data":"22d784d307ff11638a2984904fb6f442a759723493b58b4189f870353cd0a835"}
Nov 22 03:17:17 crc kubenswrapper[4952]: I1122 03:17:17.848085 4952 generic.go:334] "Generic (PLEG): container finished" podID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerID="b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b" exitCode=0
Nov 22 03:17:17 crc kubenswrapper[4952]: I1122 03:17:17.848205 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerDied","Data":"b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b"}
Nov 22 03:17:18 crc kubenswrapper[4952]: I1122 03:17:18.864058 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerStarted","Data":"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"}
Nov 22 03:17:18 crc kubenswrapper[4952]: I1122 03:17:18.898123 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qrvx5" podStartSLOduration=2.370627323 podStartE2EDuration="4.898102838s" podCreationTimestamp="2025-11-22 03:17:14 +0000 UTC" firstStartedPulling="2025-11-22 03:17:15.813509499 +0000 UTC m=+1400.119526772" lastFinishedPulling="2025-11-22 03:17:18.340985004 +0000 UTC m=+1402.647002287" observedRunningTime="2025-11-22 03:17:18.892034477 +0000 UTC m=+1403.198051740" watchObservedRunningTime="2025-11-22 03:17:18.898102838 +0000 UTC m=+1403.204120111"
Nov 22 03:17:24 crc kubenswrapper[4952]: I1122 03:17:24.964448 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:24 crc kubenswrapper[4952]: I1122 03:17:24.965421 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.050672 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.490258 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"]
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.492991 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.508716 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"]
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.655053 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.655273 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.655949 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmmtw\" (UniqueName: \"kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.757875 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmmtw\" (UniqueName: \"kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.758451 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.758519 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.759038 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.759125 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.780581 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmmtw\" (UniqueName: \"kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw\") pod \"redhat-marketplace-f9x48\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") " pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:25 crc kubenswrapper[4952]: I1122 03:17:25.831215 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:26 crc kubenswrapper[4952]: I1122 03:17:26.005911 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:26 crc kubenswrapper[4952]: I1122 03:17:26.343113 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"]
Nov 22 03:17:26 crc kubenswrapper[4952]: I1122 03:17:26.957308 4952 generic.go:334] "Generic (PLEG): container finished" podID="281501c4-600a-48e4-8392-77d0d20cdab6" containerID="f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f" exitCode=0
Nov 22 03:17:26 crc kubenswrapper[4952]: I1122 03:17:26.957450 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerDied","Data":"f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f"}
Nov 22 03:17:26 crc kubenswrapper[4952]: I1122 03:17:26.958825 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerStarted","Data":"387f418cfaaa1321b7869e99252e73e44ef7c5a2c5c89b43517d67492c908b66"}
Nov 22 03:17:27 crc kubenswrapper[4952]: I1122 03:17:27.972464 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerStarted","Data":"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b"}
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.291404 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.291695 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qrvx5" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="registry-server" containerID="cri-o://342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3" gracePeriod=2
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.740191 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.828681 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities\") pod \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") "
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.828894 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94m9k\" (UniqueName: \"kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k\") pod \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") "
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.828979 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content\") pod \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\" (UID: \"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39\") "
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.829945 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities" (OuterVolumeSpecName: "utilities") pod "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" (UID: "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.830191 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.837817 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k" (OuterVolumeSpecName: "kube-api-access-94m9k") pod "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" (UID: "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39"). InnerVolumeSpecName "kube-api-access-94m9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.895290 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" (UID: "ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.932273 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94m9k\" (UniqueName: \"kubernetes.io/projected/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-kube-api-access-94m9k\") on node \"crc\" DevicePath \"\""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.932318 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.987062 4952 generic.go:334] "Generic (PLEG): container finished" podID="281501c4-600a-48e4-8392-77d0d20cdab6" containerID="74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b" exitCode=0
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.987184 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerDied","Data":"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b"}
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.991457 4952 generic.go:334] "Generic (PLEG): container finished" podID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerID="342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3" exitCode=0
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.991578 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrvx5"
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.991522 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerDied","Data":"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"}
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.991782 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrvx5" event={"ID":"ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39","Type":"ContainerDied","Data":"22d784d307ff11638a2984904fb6f442a759723493b58b4189f870353cd0a835"}
Nov 22 03:17:28 crc kubenswrapper[4952]: I1122 03:17:28.991825 4952 scope.go:117] "RemoveContainer" containerID="342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.052199 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.052711 4952 scope.go:117] "RemoveContainer" containerID="b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.064797 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qrvx5"]
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.088445 4952 scope.go:117] "RemoveContainer" containerID="38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.137052 4952 scope.go:117] "RemoveContainer" containerID="342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"
Nov 22 03:17:29 crc kubenswrapper[4952]: E1122 03:17:29.137760 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3\": container with ID starting with 342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3 not found: ID does not exist" containerID="342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.137815 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3"} err="failed to get container status \"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3\": rpc error: code = NotFound desc = could not find container \"342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3\": container with ID starting with 342dd078a31d3998169100fd155ecb1d50a9545482661715d80b787d714d69b3 not found: ID does not exist"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.137845 4952 scope.go:117] "RemoveContainer" containerID="b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b"
Nov 22 03:17:29 crc kubenswrapper[4952]: E1122 03:17:29.138298 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b\": container with ID starting with b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b not found: ID does not exist" containerID="b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.138402 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b"} err="failed to get container status \"b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b\": rpc error: code = NotFound desc = could not find container \"b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b\": container with ID starting with b77cbdf00b9d22ac980d94138b8f6955339cd4af7499fa80242ef7f966023f4b not found: ID does not exist"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.138456 4952 scope.go:117] "RemoveContainer" containerID="38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526"
Nov 22 03:17:29 crc kubenswrapper[4952]: E1122 03:17:29.139004 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526\": container with ID starting with 38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526 not found: ID does not exist" containerID="38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526"
Nov 22 03:17:29 crc kubenswrapper[4952]: I1122 03:17:29.139037 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526"} err="failed to get container status \"38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526\": rpc error: code = NotFound desc = could not find container \"38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526\": container with ID starting with 38ab3b44cb3b526752bf6e02aad73b45d925e191c5aaf0a1f4f2b869e1e75526 not found: ID does not exist"
Nov 22 03:17:30 crc kubenswrapper[4952]: I1122 03:17:30.004435 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerStarted","Data":"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792"}
Nov 22 03:17:30 crc kubenswrapper[4952]: I1122 03:17:30.032020 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f9x48" podStartSLOduration=2.62236579 podStartE2EDuration="5.031996224s" podCreationTimestamp="2025-11-22 03:17:25 +0000 UTC" firstStartedPulling="2025-11-22 03:17:26.960663526 +0000 UTC m=+1411.266680799" lastFinishedPulling="2025-11-22 03:17:29.37029392 +0000 UTC m=+1413.676311233" observedRunningTime="2025-11-22 03:17:30.021904547 +0000 UTC m=+1414.327921830" watchObservedRunningTime="2025-11-22 03:17:30.031996224 +0000 UTC m=+1414.338013507"
Nov 22 03:17:30 crc kubenswrapper[4952]: I1122 03:17:30.545719 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" path="/var/lib/kubelet/pods/ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39/volumes"
Nov 22 03:17:35 crc kubenswrapper[4952]: I1122 03:17:35.832243 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:35 crc kubenswrapper[4952]: I1122 03:17:35.834603 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:35 crc kubenswrapper[4952]: I1122 03:17:35.911336 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:36 crc kubenswrapper[4952]: I1122 03:17:36.140684 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:36 crc kubenswrapper[4952]: I1122 03:17:36.212299 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"]
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.120530 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f9x48" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="registry-server" containerID="cri-o://364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792" gracePeriod=2
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.697414 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9x48"
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.773107 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmmtw\" (UniqueName: \"kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw\") pod \"281501c4-600a-48e4-8392-77d0d20cdab6\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") "
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.773249 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content\") pod \"281501c4-600a-48e4-8392-77d0d20cdab6\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") "
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.773870 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities\") pod \"281501c4-600a-48e4-8392-77d0d20cdab6\" (UID: \"281501c4-600a-48e4-8392-77d0d20cdab6\") "
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.774507 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities" (OuterVolumeSpecName: "utilities") pod "281501c4-600a-48e4-8392-77d0d20cdab6" (UID: "281501c4-600a-48e4-8392-77d0d20cdab6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.781916 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw" (OuterVolumeSpecName: "kube-api-access-nmmtw") pod "281501c4-600a-48e4-8392-77d0d20cdab6" (UID: "281501c4-600a-48e4-8392-77d0d20cdab6"). InnerVolumeSpecName "kube-api-access-nmmtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.793634 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "281501c4-600a-48e4-8392-77d0d20cdab6" (UID: "281501c4-600a-48e4-8392-77d0d20cdab6"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.876375 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.876429 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmmtw\" (UniqueName: \"kubernetes.io/projected/281501c4-600a-48e4-8392-77d0d20cdab6-kube-api-access-nmmtw\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:38 crc kubenswrapper[4952]: I1122 03:17:38.876447 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/281501c4-600a-48e4-8392-77d0d20cdab6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.135682 4952 generic.go:334] "Generic (PLEG): container finished" podID="281501c4-600a-48e4-8392-77d0d20cdab6" containerID="364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792" exitCode=0 Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.135735 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerDied","Data":"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792"} Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.135758 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9x48" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.135775 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9x48" event={"ID":"281501c4-600a-48e4-8392-77d0d20cdab6","Type":"ContainerDied","Data":"387f418cfaaa1321b7869e99252e73e44ef7c5a2c5c89b43517d67492c908b66"} Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.135797 4952 scope.go:117] "RemoveContainer" containerID="364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.162361 4952 scope.go:117] "RemoveContainer" containerID="74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.179911 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"] Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.196031 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9x48"] Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.204532 4952 scope.go:117] "RemoveContainer" containerID="f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.241095 4952 scope.go:117] "RemoveContainer" containerID="364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792" Nov 22 03:17:39 crc kubenswrapper[4952]: E1122 03:17:39.241913 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792\": container with ID starting with 364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792 not found: ID does not exist" containerID="364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792" Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.242008 4952 
Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.242008 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792"} err="failed to get container status \"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792\": rpc error: code = NotFound desc = could not find container \"364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792\": container with ID starting with 364a9b62bccfa75d556a438c77727bd1f5d51415a737382abf333bf9646d0792 not found: ID does not exist"
Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.242053 4952 scope.go:117] "RemoveContainer" containerID="74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b"
Nov 22 03:17:39 crc kubenswrapper[4952]: E1122 03:17:39.242621 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b\": container with ID starting with 74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b not found: ID does not exist" containerID="74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b"
Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.242657 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b"} err="failed to get container status \"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b\": rpc error: code = NotFound desc = could not find container \"74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b\": container with ID starting with 74c90c335073ba91ca05825df044b3adc081e2a24d32e310a8c9200834dd525b not found: ID does not exist"
Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.242679 4952 scope.go:117] "RemoveContainer" containerID="f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f"
Nov 22 03:17:39 crc kubenswrapper[4952]: E1122 03:17:39.243268 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f\": container with ID starting with f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f not found: ID does not exist" containerID="f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f"
Nov 22 03:17:39 crc kubenswrapper[4952]: I1122 03:17:39.243297 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f"} err="failed to get container status \"f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f\": rpc error: code = NotFound desc = could not find container \"f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f\": container with ID starting with f60321c2e8efa14f2e0a5bbdcc56c8567e2612b4f5ec3c69ffa29648a100ee7f not found: ID does not exist"
Nov 22 03:17:40 crc kubenswrapper[4952]: I1122 03:17:40.546565 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" path="/var/lib/kubelet/pods/281501c4-600a-48e4-8392-77d0d20cdab6/volumes"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.568906 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vpllg"]
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569401 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="extract-utilities"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569417 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="extract-utilities"
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569434 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569440 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569468 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569474 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569484 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="extract-content"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569492 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="extract-content"
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569507 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="extract-utilities"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569513 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="extract-utilities"
Nov 22 03:17:41 crc kubenswrapper[4952]: E1122 03:17:41.569525 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="extract-content"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569532 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="extract-content"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569952 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="281501c4-600a-48e4-8392-77d0d20cdab6" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.569978 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1710d4-7ecf-4fb9-90d5-c6fed0b5bb39" containerName="registry-server"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.573182 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.610238 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpllg"]
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.639734 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.639825 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vcx4\" (UniqueName: \"kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.640879 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.744055 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.744161 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vcx4\" (UniqueName: \"kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.744235 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.745016 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.745292 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg"
"MountVolume.SetUp succeeded for volume \"kube-api-access-5vcx4\" (UniqueName: \"kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4\") pod \"certified-operators-vpllg\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:41 crc kubenswrapper[4952]: I1122 03:17:41.900302 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:42 crc kubenswrapper[4952]: I1122 03:17:42.469576 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpllg"] Nov 22 03:17:43 crc kubenswrapper[4952]: I1122 03:17:43.201353 4952 generic.go:334] "Generic (PLEG): container finished" podID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerID="58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f" exitCode=0 Nov 22 03:17:43 crc kubenswrapper[4952]: I1122 03:17:43.201444 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerDied","Data":"58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f"} Nov 22 03:17:43 crc kubenswrapper[4952]: I1122 03:17:43.201903 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerStarted","Data":"b039b63ec911434aa095dc561ac48d9f3dddb63aece169f33714829810ef3ec6"} Nov 22 03:17:45 crc kubenswrapper[4952]: I1122 03:17:45.233324 4952 generic.go:334] "Generic (PLEG): container finished" podID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerID="ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718" exitCode=0 Nov 22 03:17:45 crc kubenswrapper[4952]: I1122 03:17:45.233396 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerDied","Data":"ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718"} Nov 22 03:17:46 crc kubenswrapper[4952]: I1122 03:17:46.248612 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerStarted","Data":"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316"} Nov 22 03:17:46 crc kubenswrapper[4952]: I1122 03:17:46.274429 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vpllg" podStartSLOduration=2.834069021 podStartE2EDuration="5.274410428s" podCreationTimestamp="2025-11-22 03:17:41 +0000 UTC" firstStartedPulling="2025-11-22 03:17:43.204866347 +0000 UTC m=+1427.510883620" lastFinishedPulling="2025-11-22 03:17:45.645207744 +0000 UTC m=+1429.951225027" observedRunningTime="2025-11-22 03:17:46.265359968 +0000 UTC m=+1430.571377271" watchObservedRunningTime="2025-11-22 03:17:46.274410428 +0000 UTC m=+1430.580427691" Nov 22 03:17:51 crc kubenswrapper[4952]: I1122 03:17:51.901301 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:51 crc kubenswrapper[4952]: I1122 03:17:51.902579 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:51 crc kubenswrapper[4952]: I1122 03:17:51.975779 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:52 crc kubenswrapper[4952]: I1122 03:17:52.391410 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:52 crc kubenswrapper[4952]: I1122 03:17:52.448949 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpllg"] Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.344520 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vpllg" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="registry-server" containerID="cri-o://20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316" gracePeriod=2 Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.788644 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.913253 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content\") pod \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.913956 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vcx4\" (UniqueName: \"kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4\") pod \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.914000 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities\") pod \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\" (UID: \"e3b1042f-183d-4b73-8cfb-db7dbbb4f373\") " Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.914908 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities" (OuterVolumeSpecName: "utilities") pod "e3b1042f-183d-4b73-8cfb-db7dbbb4f373" (UID: "e3b1042f-183d-4b73-8cfb-db7dbbb4f373"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:17:54 crc kubenswrapper[4952]: I1122 03:17:54.923531 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4" (OuterVolumeSpecName: "kube-api-access-5vcx4") pod "e3b1042f-183d-4b73-8cfb-db7dbbb4f373" (UID: "e3b1042f-183d-4b73-8cfb-db7dbbb4f373"). InnerVolumeSpecName "kube-api-access-5vcx4". 
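The "SyncLoop (probe)" records above show the usual catalog-pod sequence: the startup probe reports unhealthy until its first success ("started"), and only then does readiness move from empty to "ready". A small fold of those records into per-pod probe timelines (probe_timelines is our name; the regex assumes this journal's formatting):

    import re
    from collections import defaultdict

    PROBE = re.compile(r'"SyncLoop \(probe\)" probe="(\w+)" status="(\w*)" pod="([^"]+)"')

    def probe_timelines(lines):
        timelines = defaultdict(list)
        for line in lines:
            m = PROBE.search(line)
            if m:
                probe, status, pod = m.groups()
                timelines[pod].append((probe, status or "(empty)"))
        return dict(timelines)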
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.017946 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vcx4\" (UniqueName: \"kubernetes.io/projected/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-kube-api-access-5vcx4\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.018009 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.071781 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3b1042f-183d-4b73-8cfb-db7dbbb4f373" (UID: "e3b1042f-183d-4b73-8cfb-db7dbbb4f373"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.119651 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b1042f-183d-4b73-8cfb-db7dbbb4f373-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.360774 4952 generic.go:334] "Generic (PLEG): container finished" podID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerID="20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316" exitCode=0 Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.360862 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerDied","Data":"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316"} Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.360886 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpllg" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.360955 4952 scope.go:117] "RemoveContainer" containerID="20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.360935 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpllg" event={"ID":"e3b1042f-183d-4b73-8cfb-db7dbbb4f373","Type":"ContainerDied","Data":"b039b63ec911434aa095dc561ac48d9f3dddb63aece169f33714829810ef3ec6"} Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.384841 4952 scope.go:117] "RemoveContainer" containerID="ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.405723 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpllg"] Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.414661 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vpllg"] Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.423828 4952 scope.go:117] "RemoveContainer" containerID="58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.463740 4952 scope.go:117] "RemoveContainer" containerID="20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316" Nov 22 03:17:55 crc kubenswrapper[4952]: E1122 03:17:55.464562 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316\": container with ID starting with 20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316 not found: ID does not exist" containerID="20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.464627 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316"} err="failed to get container status \"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316\": rpc error: code = NotFound desc = could not find container \"20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316\": container with ID starting with 20b5dde67e0057012540312456b5c96b30f0a648f374abdfbebf5db4c73ba316 not found: ID does not exist" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.464668 4952 scope.go:117] "RemoveContainer" containerID="ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718" Nov 22 03:17:55 crc kubenswrapper[4952]: E1122 03:17:55.465374 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718\": container with ID starting with ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718 not found: ID does not exist" containerID="ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.465442 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718"} err="failed to get container status \"ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718\": rpc error: code = NotFound desc = could not find 
container \"ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718\": container with ID starting with ce03f4c4b1d02d50166c07a68cd1349c8dd86529625b666c9128b9826c69c718 not found: ID does not exist" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.465498 4952 scope.go:117] "RemoveContainer" containerID="58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f" Nov 22 03:17:55 crc kubenswrapper[4952]: E1122 03:17:55.466666 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f\": container with ID starting with 58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f not found: ID does not exist" containerID="58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f" Nov 22 03:17:55 crc kubenswrapper[4952]: I1122 03:17:55.466705 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f"} err="failed to get container status \"58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f\": rpc error: code = NotFound desc = could not find container \"58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f\": container with ID starting with 58103a5643a5c3a0b878cdf6039eaba51cd0514178dc9b67a85cd7c1558f653f not found: ID does not exist" Nov 22 03:17:56 crc kubenswrapper[4952]: I1122 03:17:56.550392 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" path="/var/lib/kubelet/pods/e3b1042f-183d-4b73-8cfb-db7dbbb4f373/volumes" Nov 22 03:17:58 crc kubenswrapper[4952]: I1122 03:17:58.342048 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:17:58 crc kubenswrapper[4952]: I1122 03:17:58.342636 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:18:28 crc kubenswrapper[4952]: I1122 03:18:28.342157 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:18:28 crc kubenswrapper[4952]: I1122 03:18:28.343075 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.867023 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:30 crc kubenswrapper[4952]: E1122 03:18:30.867838 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" 
containerName="extract-utilities" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.867855 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="extract-utilities" Nov 22 03:18:30 crc kubenswrapper[4952]: E1122 03:18:30.867875 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="registry-server" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.867883 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="registry-server" Nov 22 03:18:30 crc kubenswrapper[4952]: E1122 03:18:30.867897 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="extract-content" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.867904 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="extract-content" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.868106 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b1042f-183d-4b73-8cfb-db7dbbb4f373" containerName="registry-server" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.873044 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.884107 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.986517 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.986754 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:30 crc kubenswrapper[4952]: I1122 03:18:30.986850 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbj7\" (UniqueName: \"kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.089248 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.089337 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbj7\" (UniqueName: \"kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " 
pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.089426 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.090135 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.090150 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.116023 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbj7\" (UniqueName: \"kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7\") pod \"redhat-operators-lml77\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.204101 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.723652 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:31 crc kubenswrapper[4952]: I1122 03:18:31.781853 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerStarted","Data":"7d0cc96a47f29e03f9bc1b0b9ffbabaf2fcb3f92fc3cb34897d5c278f833b49f"} Nov 22 03:18:32 crc kubenswrapper[4952]: I1122 03:18:32.795631 4952 generic.go:334] "Generic (PLEG): container finished" podID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerID="c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a" exitCode=0 Nov 22 03:18:32 crc kubenswrapper[4952]: I1122 03:18:32.795695 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerDied","Data":"c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a"} Nov 22 03:18:33 crc kubenswrapper[4952]: I1122 03:18:33.808139 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerStarted","Data":"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec"} Nov 22 03:18:35 crc kubenswrapper[4952]: I1122 03:18:35.832741 4952 generic.go:334] "Generic (PLEG): container finished" podID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerID="b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec" exitCode=0 Nov 22 03:18:35 crc kubenswrapper[4952]: I1122 03:18:35.832816 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerDied","Data":"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec"} Nov 22 03:18:36 crc kubenswrapper[4952]: I1122 03:18:36.844812 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerStarted","Data":"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9"} Nov 22 03:18:36 crc kubenswrapper[4952]: I1122 03:18:36.874227 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lml77" podStartSLOduration=3.193218619 podStartE2EDuration="6.874198452s" podCreationTimestamp="2025-11-22 03:18:30 +0000 UTC" firstStartedPulling="2025-11-22 03:18:32.79905258 +0000 UTC m=+1477.105069853" lastFinishedPulling="2025-11-22 03:18:36.480032403 +0000 UTC m=+1480.786049686" observedRunningTime="2025-11-22 03:18:36.864373662 +0000 UTC m=+1481.170390945" watchObservedRunningTime="2025-11-22 03:18:36.874198452 +0000 UTC m=+1481.180215725" Nov 22 03:18:38 crc kubenswrapper[4952]: I1122 03:18:38.862521 4952 generic.go:334] "Generic (PLEG): container finished" podID="562bf029-85bf-47a6-b4a7-913eb130f85b" containerID="fce6a3395361c8740b39d4467840427834d826c7b1fb32c5f5321af038786ec7" exitCode=0 Nov 22 03:18:38 crc kubenswrapper[4952]: I1122 03:18:38.862596 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" event={"ID":"562bf029-85bf-47a6-b4a7-913eb130f85b","Type":"ContainerDied","Data":"fce6a3395361c8740b39d4467840427834d826c7b1fb32c5f5321af038786ec7"} Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.355171 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.414806 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key\") pod \"562bf029-85bf-47a6-b4a7-913eb130f85b\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.414931 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle\") pod \"562bf029-85bf-47a6-b4a7-913eb130f85b\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.415211 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory\") pod \"562bf029-85bf-47a6-b4a7-913eb130f85b\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.415279 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkthp\" (UniqueName: \"kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp\") pod \"562bf029-85bf-47a6-b4a7-913eb130f85b\" (UID: \"562bf029-85bf-47a6-b4a7-913eb130f85b\") " Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.421913 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "562bf029-85bf-47a6-b4a7-913eb130f85b" (UID: "562bf029-85bf-47a6-b4a7-913eb130f85b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.422171 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp" (OuterVolumeSpecName: "kube-api-access-zkthp") pod "562bf029-85bf-47a6-b4a7-913eb130f85b" (UID: "562bf029-85bf-47a6-b4a7-913eb130f85b"). InnerVolumeSpecName "kube-api-access-zkthp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.454249 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "562bf029-85bf-47a6-b4a7-913eb130f85b" (UID: "562bf029-85bf-47a6-b4a7-913eb130f85b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.454535 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory" (OuterVolumeSpecName: "inventory") pod "562bf029-85bf-47a6-b4a7-913eb130f85b" (UID: "562bf029-85bf-47a6-b4a7-913eb130f85b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.518114 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.518167 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkthp\" (UniqueName: \"kubernetes.io/projected/562bf029-85bf-47a6-b4a7-913eb130f85b-kube-api-access-zkthp\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.518183 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.518197 4952 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562bf029-85bf-47a6-b4a7-913eb130f85b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.891147 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" event={"ID":"562bf029-85bf-47a6-b4a7-913eb130f85b","Type":"ContainerDied","Data":"6a76c613ae6b124544c444676da0c827615033df3d0adc69d094b823c84380f2"} Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.891227 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a76c613ae6b124544c444676da0c827615033df3d0adc69d094b823c84380f2" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.891265 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.991785 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw"] Nov 22 03:18:40 crc kubenswrapper[4952]: E1122 03:18:40.992190 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562bf029-85bf-47a6-b4a7-913eb130f85b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.992208 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="562bf029-85bf-47a6-b4a7-913eb130f85b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.992386 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="562bf029-85bf-47a6-b4a7-913eb130f85b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.993012 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.995740 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:18:40 crc kubenswrapper[4952]: I1122 03:18:40.998093 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.000095 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.009755 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.013858 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw"] Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.034402 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9pf\" (UniqueName: \"kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.034495 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.034659 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.135809 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.135889 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9pf\" (UniqueName: \"kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.135962 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.142064 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.143346 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.157973 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9pf\" (UniqueName: \"kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-98ltw\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.204989 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.205618 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.346816 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:18:41 crc kubenswrapper[4952]: I1122 03:18:41.906464 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw"] Nov 22 03:18:42 crc kubenswrapper[4952]: I1122 03:18:42.259786 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lml77" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="registry-server" probeResult="failure" output=< Nov 22 03:18:42 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s Nov 22 03:18:42 crc kubenswrapper[4952]: > Nov 22 03:18:42 crc kubenswrapper[4952]: I1122 03:18:42.917429 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" event={"ID":"460e80b9-651b-4dc8-a82e-f9a45d9f18ac","Type":"ContainerStarted","Data":"4a1a8716a112bd2735614675416540a91718d7bde91565c87bc377779d013126"} Nov 22 03:18:42 crc kubenswrapper[4952]: I1122 03:18:42.919473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" event={"ID":"460e80b9-651b-4dc8-a82e-f9a45d9f18ac","Type":"ContainerStarted","Data":"64f73227af297c13c7ff641113dc1bc724e2ac492f1bb2e67db8afdfee08ec87"} Nov 22 03:18:42 crc kubenswrapper[4952]: I1122 03:18:42.953979 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" podStartSLOduration=2.479088195 podStartE2EDuration="2.95393989s" podCreationTimestamp="2025-11-22 03:18:40 +0000 UTC" firstStartedPulling="2025-11-22 03:18:41.90908259 +0000 UTC m=+1486.215099863" lastFinishedPulling="2025-11-22 03:18:42.383934285 +0000 UTC m=+1486.689951558" observedRunningTime="2025-11-22 03:18:42.93921251 +0000 UTC m=+1487.245229823" watchObservedRunningTime="2025-11-22 03:18:42.95393989 +0000 UTC m=+1487.259957173" Nov 22 03:18:51 crc kubenswrapper[4952]: I1122 03:18:51.274515 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:51 crc kubenswrapper[4952]: I1122 03:18:51.353179 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:51 crc kubenswrapper[4952]: I1122 03:18:51.539341 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.034495 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lml77" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="registry-server" containerID="cri-o://3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9" gracePeriod=2 Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.510627 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.603153 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities\") pod \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.603663 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content\") pod \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.603768 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbj7\" (UniqueName: \"kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7\") pod \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\" (UID: \"bfb36133-2b7d-41a6-b3b2-3c5b942903e0\") " Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.604854 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities" (OuterVolumeSpecName: "utilities") pod "bfb36133-2b7d-41a6-b3b2-3c5b942903e0" (UID: "bfb36133-2b7d-41a6-b3b2-3c5b942903e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.610527 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7" (OuterVolumeSpecName: "kube-api-access-fzbj7") pod "bfb36133-2b7d-41a6-b3b2-3c5b942903e0" (UID: "bfb36133-2b7d-41a6-b3b2-3c5b942903e0"). InnerVolumeSpecName "kube-api-access-fzbj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.702608 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfb36133-2b7d-41a6-b3b2-3c5b942903e0" (UID: "bfb36133-2b7d-41a6-b3b2-3c5b942903e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.706059 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.706082 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbj7\" (UniqueName: \"kubernetes.io/projected/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-kube-api-access-fzbj7\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:53 crc kubenswrapper[4952]: I1122 03:18:53.706094 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb36133-2b7d-41a6-b3b2-3c5b942903e0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.049805 4952 generic.go:334] "Generic (PLEG): container finished" podID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerID="3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9" exitCode=0 Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.049870 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerDied","Data":"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9"} Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.049910 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lml77" event={"ID":"bfb36133-2b7d-41a6-b3b2-3c5b942903e0","Type":"ContainerDied","Data":"7d0cc96a47f29e03f9bc1b0b9ffbabaf2fcb3f92fc3cb34897d5c278f833b49f"} Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.049938 4952 scope.go:117] "RemoveContainer" containerID="3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.050070 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lml77" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.082625 4952 scope.go:117] "RemoveContainer" containerID="b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.103820 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.109756 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lml77"] Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.114138 4952 scope.go:117] "RemoveContainer" containerID="c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.175863 4952 scope.go:117] "RemoveContainer" containerID="3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9" Nov 22 03:18:54 crc kubenswrapper[4952]: E1122 03:18:54.176422 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9\": container with ID starting with 3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9 not found: ID does not exist" containerID="3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.176489 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9"} err="failed to get container status \"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9\": rpc error: code = NotFound desc = could not find container \"3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9\": container with ID starting with 3098132d9c8457b6674d130e2e5a35484aad4dc2689d154cb196544ee64699f9 not found: ID does not exist" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.176526 4952 scope.go:117] "RemoveContainer" containerID="b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec" Nov 22 03:18:54 crc kubenswrapper[4952]: E1122 03:18:54.177175 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec\": container with ID starting with b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec not found: ID does not exist" containerID="b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.177267 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec"} err="failed to get container status \"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec\": rpc error: code = NotFound desc = could not find container \"b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec\": container with ID starting with b06dc31dc56ec967318d8bd6d526b702cae9271a5ca541b037b51fc0ddaf34ec not found: ID does not exist" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.177346 4952 scope.go:117] "RemoveContainer" containerID="c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a" Nov 22 03:18:54 crc kubenswrapper[4952]: E1122 03:18:54.177862 4952 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a\": container with ID starting with c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a not found: ID does not exist" containerID="c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.177909 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a"} err="failed to get container status \"c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a\": rpc error: code = NotFound desc = could not find container \"c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a\": container with ID starting with c62de209f5fa2e90b19c82abd19c8149e1e62f62e1f8a503c1b20282bdd0e33a not found: ID does not exist" Nov 22 03:18:54 crc kubenswrapper[4952]: I1122 03:18:54.551652 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" path="/var/lib/kubelet/pods/bfb36133-2b7d-41a6-b3b2-3c5b942903e0/volumes" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.342377 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.343018 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.343112 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.344215 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.344320 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" gracePeriod=600 Nov 22 03:18:58 crc kubenswrapper[4952]: E1122 03:18:58.487478 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.616153 4952 scope.go:117] 
"RemoveContainer" containerID="2267fd930845ffd8d1a14b5f061e1287e841e0bc24750f7570b3b8ed7410368f" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.641275 4952 scope.go:117] "RemoveContainer" containerID="375056e0277bb11196aae00a2a8b965456a69a78f4d67fe129dd5ebcb7ef9e72" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.660988 4952 scope.go:117] "RemoveContainer" containerID="5d065045115d8c98abe0ff39c52373d088b79adbca15c5dedd4fb02191dd25cd" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.692234 4952 scope.go:117] "RemoveContainer" containerID="842c96526b437073cef231b947fb269736331f7dee1711886023affda9c4b430" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.714398 4952 scope.go:117] "RemoveContainer" containerID="c7f50c7374868506e18b188906a8af5cdaa9fec349ec6bd867a26c80369bcb76" Nov 22 03:18:58 crc kubenswrapper[4952]: I1122 03:18:58.736931 4952 scope.go:117] "RemoveContainer" containerID="6cc40601b5d19bf9ec48c035f909fb3b9807595738ade2d1a024d8e2dbe998a9" Nov 22 03:18:59 crc kubenswrapper[4952]: I1122 03:18:59.120743 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" exitCode=0 Nov 22 03:18:59 crc kubenswrapper[4952]: I1122 03:18:59.120845 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"} Nov 22 03:18:59 crc kubenswrapper[4952]: I1122 03:18:59.120948 4952 scope.go:117] "RemoveContainer" containerID="15846c38005a8395c19c26d63eb9f008cd0288cc544d3ca54c338b089d4cf1e5" Nov 22 03:18:59 crc kubenswrapper[4952]: I1122 03:18:59.124356 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:18:59 crc kubenswrapper[4952]: E1122 03:18:59.124740 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:19:10 crc kubenswrapper[4952]: I1122 03:19:10.532240 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:19:10 crc kubenswrapper[4952]: E1122 03:19:10.533751 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:19:22 crc kubenswrapper[4952]: I1122 03:19:22.532844 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:19:22 crc kubenswrapper[4952]: E1122 03:19:22.534769 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:19:36 crc kubenswrapper[4952]: I1122 03:19:36.534886 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:19:36 crc kubenswrapper[4952]: E1122 03:19:36.537059 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:19:48 crc kubenswrapper[4952]: I1122 03:19:48.531636 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:19:48 crc kubenswrapper[4952]: E1122 03:19:48.532381 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.069359 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5e82-account-create-c5txw"] Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.084353 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bp7sw"] Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.095379 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5e82-account-create-c5txw"] Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.104993 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bp7sw"] Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.547065 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975b18ab-9e5c-4640-be72-1d2e76ebedda" path="/var/lib/kubelet/pods/975b18ab-9e5c-4640-be72-1d2e76ebedda/volumes" Nov 22 03:19:54 crc kubenswrapper[4952]: I1122 03:19:54.547953 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd474404-fc22-4fec-8b02-9c536b93d36e" path="/var/lib/kubelet/pods/fd474404-fc22-4fec-8b02-9c536b93d36e/volumes" Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.053005 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-dc03-account-create-p47mp"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.064819 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-dc03-account-create-p47mp"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.073774 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-b5bx2"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.081749 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-b5bx2"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.088671 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-bbee-account-create-zrsv8"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.095280 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jhbw8"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.101824 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jhbw8"] Nov 22 03:19:55 crc kubenswrapper[4952]: I1122 03:19:55.111955 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bbee-account-create-zrsv8"] Nov 22 03:19:56 crc kubenswrapper[4952]: I1122 03:19:56.543957 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ed0223-bfb6-490c-b39d-3f57968b5744" path="/var/lib/kubelet/pods/03ed0223-bfb6-490c-b39d-3f57968b5744/volumes" Nov 22 03:19:56 crc kubenswrapper[4952]: I1122 03:19:56.544968 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3187f587-bdb9-4e8e-a009-e08c4d420041" path="/var/lib/kubelet/pods/3187f587-bdb9-4e8e-a009-e08c4d420041/volumes" Nov 22 03:19:56 crc kubenswrapper[4952]: I1122 03:19:56.545586 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f4d260-3157-440f-860e-f260bb4c6052" path="/var/lib/kubelet/pods/71f4d260-3157-440f-860e-f260bb4c6052/volumes" Nov 22 03:19:56 crc kubenswrapper[4952]: I1122 03:19:56.546123 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ae67f4-66a0-4a73-80f2-eacbd1db165f" path="/var/lib/kubelet/pods/a1ae67f4-66a0-4a73-80f2-eacbd1db165f/volumes" Nov 22 03:19:58 crc kubenswrapper[4952]: I1122 03:19:58.828426 4952 scope.go:117] "RemoveContainer" containerID="736bab4512c8d69a580698ea5780ace6a0b015b183636a940eb865391b295209" Nov 22 03:19:58 crc kubenswrapper[4952]: I1122 03:19:58.910135 4952 scope.go:117] "RemoveContainer" containerID="57e8a748cf20c980f359b1c9a53bdeec79a9eee4f038d679f0e2354e0a4ecd25" Nov 22 03:19:58 crc kubenswrapper[4952]: I1122 03:19:58.932147 4952 scope.go:117] "RemoveContainer" containerID="bcf338e4137c3abe5a2700ddbd7809bd5d4eaf979ccda9323d21b8793e269db0" Nov 22 03:19:58 crc kubenswrapper[4952]: I1122 03:19:58.978696 4952 scope.go:117] "RemoveContainer" containerID="5c530a60887afa6c89f2bd60c9848ebc583dbbd794067c75f3cfc0f2b1f119f2" Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.026834 4952 scope.go:117] "RemoveContainer" containerID="15ab4dd1f69cacb3c62533c10cbf7bcfe0163cc87e2a11360cdf975863cfa9c1" Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.064417 4952 scope.go:117] "RemoveContainer" containerID="840c535c7277fd81b578cda6c382ef8911bde294f5c163063bd4151e65960143" Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.107402 4952 scope.go:117] "RemoveContainer" containerID="07166c25dc216572bd6890b512a4a0240020f5e02f36b7887949de8071d6c9aa" Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.143108 4952 scope.go:117] "RemoveContainer" containerID="99ae91e2f66a6c6e4655b1db7391481fdd2d51642ca0992f30d2b8bb833bef3c" Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.831526 4952 generic.go:334] "Generic (PLEG): container finished" podID="460e80b9-651b-4dc8-a82e-f9a45d9f18ac" containerID="4a1a8716a112bd2735614675416540a91718d7bde91565c87bc377779d013126" exitCode=0 Nov 22 03:19:59 crc kubenswrapper[4952]: I1122 03:19:59.831680 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" 
event={"ID":"460e80b9-651b-4dc8-a82e-f9a45d9f18ac","Type":"ContainerDied","Data":"4a1a8716a112bd2735614675416540a91718d7bde91565c87bc377779d013126"} Nov 22 03:20:00 crc kubenswrapper[4952]: I1122 03:20:00.532893 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:20:00 crc kubenswrapper[4952]: E1122 03:20:00.533360 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.315977 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.327587 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9pf\" (UniqueName: \"kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf\") pod \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.327677 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory\") pod \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.327705 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key\") pod \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\" (UID: \"460e80b9-651b-4dc8-a82e-f9a45d9f18ac\") " Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.343386 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf" (OuterVolumeSpecName: "kube-api-access-pm9pf") pod "460e80b9-651b-4dc8-a82e-f9a45d9f18ac" (UID: "460e80b9-651b-4dc8-a82e-f9a45d9f18ac"). InnerVolumeSpecName "kube-api-access-pm9pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.386955 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "460e80b9-651b-4dc8-a82e-f9a45d9f18ac" (UID: "460e80b9-651b-4dc8-a82e-f9a45d9f18ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.390421 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory" (OuterVolumeSpecName: "inventory") pod "460e80b9-651b-4dc8-a82e-f9a45d9f18ac" (UID: "460e80b9-651b-4dc8-a82e-f9a45d9f18ac"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.429476 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9pf\" (UniqueName: \"kubernetes.io/projected/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-kube-api-access-pm9pf\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.429509 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.429519 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/460e80b9-651b-4dc8-a82e-f9a45d9f18ac-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.866782 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" event={"ID":"460e80b9-651b-4dc8-a82e-f9a45d9f18ac","Type":"ContainerDied","Data":"64f73227af297c13c7ff641113dc1bc724e2ac492f1bb2e67db8afdfee08ec87"} Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.866960 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64f73227af297c13c7ff641113dc1bc724e2ac492f1bb2e67db8afdfee08ec87" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.866955 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.966136 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f"] Nov 22 03:20:01 crc kubenswrapper[4952]: E1122 03:20:01.967389 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="extract-utilities" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967414 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="extract-utilities" Nov 22 03:20:01 crc kubenswrapper[4952]: E1122 03:20:01.967428 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="registry-server" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967435 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="registry-server" Nov 22 03:20:01 crc kubenswrapper[4952]: E1122 03:20:01.967445 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="extract-content" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967453 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="extract-content" Nov 22 03:20:01 crc kubenswrapper[4952]: E1122 03:20:01.967462 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460e80b9-651b-4dc8-a82e-f9a45d9f18ac" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967469 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="460e80b9-651b-4dc8-a82e-f9a45d9f18ac" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967694 4952 
memory_manager.go:354] "RemoveStaleState removing state" podUID="460e80b9-651b-4dc8-a82e-f9a45d9f18ac" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.967713 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb36133-2b7d-41a6-b3b2-3c5b942903e0" containerName="registry-server" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.968422 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.971320 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.972246 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.972246 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.972213 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:20:01 crc kubenswrapper[4952]: I1122 03:20:01.977503 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f"] Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.140663 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.140734 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drk8\" (UniqueName: \"kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.140824 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.244422 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.244822 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drk8\" (UniqueName: 
\"kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.245018 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.250346 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.270453 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.275257 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drk8\" (UniqueName: \"kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.285921 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.681260 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f"] Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.699826 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:20:02 crc kubenswrapper[4952]: I1122 03:20:02.903766 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" event={"ID":"2c758bb2-46c5-4fbb-97de-ee24a3648250","Type":"ContainerStarted","Data":"8ea6926617dba5f98c8487770221af59acf2a8d214e8ea5aea13c532c29d425d"} Nov 22 03:20:03 crc kubenswrapper[4952]: I1122 03:20:03.913500 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" event={"ID":"2c758bb2-46c5-4fbb-97de-ee24a3648250","Type":"ContainerStarted","Data":"c4524bf3969497a964a75ec9b196ab3741ec07551dad1445d36d5bf2aa73e3b6"} Nov 22 03:20:03 crc kubenswrapper[4952]: I1122 03:20:03.940677 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" podStartSLOduration=2.457456599 podStartE2EDuration="2.940656365s" podCreationTimestamp="2025-11-22 03:20:01 +0000 UTC" firstStartedPulling="2025-11-22 03:20:02.699520077 +0000 UTC m=+1567.005537350" lastFinishedPulling="2025-11-22 03:20:03.182719823 +0000 UTC m=+1567.488737116" observedRunningTime="2025-11-22 03:20:03.93399827 +0000 UTC m=+1568.240015573" watchObservedRunningTime="2025-11-22 03:20:03.940656365 +0000 UTC m=+1568.246673639" Nov 22 03:20:08 crc kubenswrapper[4952]: I1122 03:20:08.966888 4952 generic.go:334] "Generic (PLEG): container finished" podID="2c758bb2-46c5-4fbb-97de-ee24a3648250" containerID="c4524bf3969497a964a75ec9b196ab3741ec07551dad1445d36d5bf2aa73e3b6" exitCode=0 Nov 22 03:20:08 crc kubenswrapper[4952]: I1122 03:20:08.966993 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" event={"ID":"2c758bb2-46c5-4fbb-97de-ee24a3648250","Type":"ContainerDied","Data":"c4524bf3969497a964a75ec9b196ab3741ec07551dad1445d36d5bf2aa73e3b6"} Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.497084 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.546868 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key\") pod \"2c758bb2-46c5-4fbb-97de-ee24a3648250\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.546955 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory\") pod \"2c758bb2-46c5-4fbb-97de-ee24a3648250\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.547060 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drk8\" (UniqueName: \"kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8\") pod \"2c758bb2-46c5-4fbb-97de-ee24a3648250\" (UID: \"2c758bb2-46c5-4fbb-97de-ee24a3648250\") " Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.553223 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8" (OuterVolumeSpecName: "kube-api-access-7drk8") pod "2c758bb2-46c5-4fbb-97de-ee24a3648250" (UID: "2c758bb2-46c5-4fbb-97de-ee24a3648250"). InnerVolumeSpecName "kube-api-access-7drk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.578566 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c758bb2-46c5-4fbb-97de-ee24a3648250" (UID: "2c758bb2-46c5-4fbb-97de-ee24a3648250"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.584230 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory" (OuterVolumeSpecName: "inventory") pod "2c758bb2-46c5-4fbb-97de-ee24a3648250" (UID: "2c758bb2-46c5-4fbb-97de-ee24a3648250"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.650169 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drk8\" (UniqueName: \"kubernetes.io/projected/2c758bb2-46c5-4fbb-97de-ee24a3648250-kube-api-access-7drk8\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.650206 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.650218 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c758bb2-46c5-4fbb-97de-ee24a3648250-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.989082 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" event={"ID":"2c758bb2-46c5-4fbb-97de-ee24a3648250","Type":"ContainerDied","Data":"8ea6926617dba5f98c8487770221af59acf2a8d214e8ea5aea13c532c29d425d"} Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.989140 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ea6926617dba5f98c8487770221af59acf2a8d214e8ea5aea13c532c29d425d" Nov 22 03:20:10 crc kubenswrapper[4952]: I1122 03:20:10.989192 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.087711 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff"] Nov 22 03:20:11 crc kubenswrapper[4952]: E1122 03:20:11.088367 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c758bb2-46c5-4fbb-97de-ee24a3648250" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.088398 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c758bb2-46c5-4fbb-97de-ee24a3648250" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.088763 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c758bb2-46c5-4fbb-97de-ee24a3648250" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.090832 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.093237 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.094726 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.094799 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.097741 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff"] Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.100221 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.163920 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.164560 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgzg\" (UniqueName: \"kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.164780 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.266523 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgzg\" (UniqueName: \"kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.266669 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.266756 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: 
\"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.277283 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.289340 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.291792 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgzg\" (UniqueName: \"kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hkzff\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.424789 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:11 crc kubenswrapper[4952]: I1122 03:20:11.868907 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff"] Nov 22 03:20:12 crc kubenswrapper[4952]: I1122 03:20:12.001749 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" event={"ID":"9c8715b0-7a1d-465a-9fb6-8024e98f6047","Type":"ContainerStarted","Data":"73d57c64414729748153913a6260362cde746ae787a3de93817fb5430f392dd6"} Nov 22 03:20:13 crc kubenswrapper[4952]: I1122 03:20:13.019996 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" event={"ID":"9c8715b0-7a1d-465a-9fb6-8024e98f6047","Type":"ContainerStarted","Data":"08481c41dae352c3095509139b1f03e4c6e0e8624a9014dc976c7c698cc0ea65"} Nov 22 03:20:13 crc kubenswrapper[4952]: I1122 03:20:13.061313 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" podStartSLOduration=1.6503631539999999 podStartE2EDuration="2.061287397s" podCreationTimestamp="2025-11-22 03:20:11 +0000 UTC" firstStartedPulling="2025-11-22 03:20:11.883395773 +0000 UTC m=+1576.189413056" lastFinishedPulling="2025-11-22 03:20:12.294319996 +0000 UTC m=+1576.600337299" observedRunningTime="2025-11-22 03:20:13.047236255 +0000 UTC m=+1577.353253538" watchObservedRunningTime="2025-11-22 03:20:13.061287397 +0000 UTC m=+1577.367304690" Nov 22 03:20:13 crc kubenswrapper[4952]: I1122 03:20:13.531895 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:20:13 crc kubenswrapper[4952]: E1122 03:20:13.532415 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:20:19 crc kubenswrapper[4952]: I1122 03:20:19.050012 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hcq9n"] Nov 22 03:20:19 crc kubenswrapper[4952]: I1122 03:20:19.066434 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hcq9n"] Nov 22 03:20:20 crc kubenswrapper[4952]: I1122 03:20:20.545126 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dceab59-b76c-4c04-b00f-d81a39fd90ab" path="/var/lib/kubelet/pods/7dceab59-b76c-4c04-b00f-d81a39fd90ab/volumes" Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.044316 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-r69xs"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.061919 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fjb79"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.072003 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fdrzh"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.091812 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-20da-account-create-hffxc"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.099925 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-r69xs"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.107741 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fdrzh"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.115377 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fjb79"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.122842 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-20da-account-create-hffxc"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.130376 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c721-account-create-kckf7"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.140862 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c721-account-create-kckf7"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.148456 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2d41-account-create-xj9f4"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.156130 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2d41-account-create-xj9f4"] Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.551391 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02579246-918f-41d2-b71c-661abcdb0072" path="/var/lib/kubelet/pods/02579246-918f-41d2-b71c-661abcdb0072/volumes" Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.552397 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6ba510-eab8-457f-9103-2f49b46115da" path="/var/lib/kubelet/pods/5c6ba510-eab8-457f-9103-2f49b46115da/volumes" Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.553203 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea8e22e-d027-4c7e-805d-20ba08d5a87d" path="/var/lib/kubelet/pods/aea8e22e-d027-4c7e-805d-20ba08d5a87d/volumes" Nov 22 03:20:22 crc 
kubenswrapper[4952]: I1122 03:20:22.553981 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afad0208-0c5d-49dd-ac0b-54871b694e7d" path="/var/lib/kubelet/pods/afad0208-0c5d-49dd-ac0b-54871b694e7d/volumes" Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.555410 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2f79bb-54b5-45f1-a95e-9923eefb464d" path="/var/lib/kubelet/pods/bf2f79bb-54b5-45f1-a95e-9923eefb464d/volumes" Nov 22 03:20:22 crc kubenswrapper[4952]: I1122 03:20:22.556344 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded3b067-1f44-401e-afd4-d10955e87f52" path="/var/lib/kubelet/pods/ded3b067-1f44-401e-afd4-d10955e87f52/volumes" Nov 22 03:20:26 crc kubenswrapper[4952]: I1122 03:20:26.538878 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:20:26 crc kubenswrapper[4952]: E1122 03:20:26.540065 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:20:32 crc kubenswrapper[4952]: I1122 03:20:32.044849 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-pq7nm"] Nov 22 03:20:32 crc kubenswrapper[4952]: I1122 03:20:32.073156 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-pq7nm"] Nov 22 03:20:32 crc kubenswrapper[4952]: I1122 03:20:32.545987 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5725796-1375-41a7-a8d6-80035aabc3d1" path="/var/lib/kubelet/pods/c5725796-1375-41a7-a8d6-80035aabc3d1/volumes" Nov 22 03:20:38 crc kubenswrapper[4952]: I1122 03:20:38.531651 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:20:38 crc kubenswrapper[4952]: E1122 03:20:38.532753 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:20:50 crc kubenswrapper[4952]: I1122 03:20:50.532637 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:20:50 crc kubenswrapper[4952]: E1122 03:20:50.533663 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:20:54 crc kubenswrapper[4952]: I1122 03:20:54.450042 4952 generic.go:334] "Generic (PLEG): container finished" podID="9c8715b0-7a1d-465a-9fb6-8024e98f6047" 
containerID="08481c41dae352c3095509139b1f03e4c6e0e8624a9014dc976c7c698cc0ea65" exitCode=0 Nov 22 03:20:54 crc kubenswrapper[4952]: I1122 03:20:54.450125 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" event={"ID":"9c8715b0-7a1d-465a-9fb6-8024e98f6047","Type":"ContainerDied","Data":"08481c41dae352c3095509139b1f03e4c6e0e8624a9014dc976c7c698cc0ea65"} Nov 22 03:20:55 crc kubenswrapper[4952]: I1122 03:20:55.914731 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.014736 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key\") pod \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.014803 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mgzg\" (UniqueName: \"kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg\") pod \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.014838 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory\") pod \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\" (UID: \"9c8715b0-7a1d-465a-9fb6-8024e98f6047\") " Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.022762 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg" (OuterVolumeSpecName: "kube-api-access-8mgzg") pod "9c8715b0-7a1d-465a-9fb6-8024e98f6047" (UID: "9c8715b0-7a1d-465a-9fb6-8024e98f6047"). InnerVolumeSpecName "kube-api-access-8mgzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.041378 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c8715b0-7a1d-465a-9fb6-8024e98f6047" (UID: "9c8715b0-7a1d-465a-9fb6-8024e98f6047"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.045701 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory" (OuterVolumeSpecName: "inventory") pod "9c8715b0-7a1d-465a-9fb6-8024e98f6047" (UID: "9c8715b0-7a1d-465a-9fb6-8024e98f6047"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.117405 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.117439 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mgzg\" (UniqueName: \"kubernetes.io/projected/9c8715b0-7a1d-465a-9fb6-8024e98f6047-kube-api-access-8mgzg\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.117454 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c8715b0-7a1d-465a-9fb6-8024e98f6047-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.473696 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" event={"ID":"9c8715b0-7a1d-465a-9fb6-8024e98f6047","Type":"ContainerDied","Data":"73d57c64414729748153913a6260362cde746ae787a3de93817fb5430f392dd6"} Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.473780 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d57c64414729748153913a6260362cde746ae787a3de93817fb5430f392dd6" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.473865 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.598285 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp"] Nov 22 03:20:56 crc kubenswrapper[4952]: E1122 03:20:56.599288 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c8715b0-7a1d-465a-9fb6-8024e98f6047" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.599317 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8715b0-7a1d-465a-9fb6-8024e98f6047" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.599598 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c8715b0-7a1d-465a-9fb6-8024e98f6047" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.600432 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.606631 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.606725 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.609853 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp"] Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.611805 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.611849 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.734379 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.734561 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjdt\" (UniqueName: \"kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.734624 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.836367 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.836682 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.836804 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjdt\" (UniqueName: \"kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" 
(UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.840742 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.847672 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.859834 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjdt\" (UniqueName: \"kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:56 crc kubenswrapper[4952]: I1122 03:20:56.918820 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:20:57 crc kubenswrapper[4952]: I1122 03:20:57.601721 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp"] Nov 22 03:20:58 crc kubenswrapper[4952]: I1122 03:20:58.504059 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" event={"ID":"d1bfe334-0090-44fc-9132-ddd3bbc810b1","Type":"ContainerStarted","Data":"8cdfc3bb0f74a692f48fa4d8c8374acaab432f9ea8dd492c50ef96503e0113e6"} Nov 22 03:20:58 crc kubenswrapper[4952]: I1122 03:20:58.504841 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" event={"ID":"d1bfe334-0090-44fc-9132-ddd3bbc810b1","Type":"ContainerStarted","Data":"ff70e3605a0a1ccd5a4c0b3b35c5a9c828546facc4f2dc872703ef5ca023293c"} Nov 22 03:20:58 crc kubenswrapper[4952]: I1122 03:20:58.532397 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" podStartSLOduration=2.124306935 podStartE2EDuration="2.532374987s" podCreationTimestamp="2025-11-22 03:20:56 +0000 UTC" firstStartedPulling="2025-11-22 03:20:57.602954054 +0000 UTC m=+1621.908971337" lastFinishedPulling="2025-11-22 03:20:58.011022086 +0000 UTC m=+1622.317039389" observedRunningTime="2025-11-22 03:20:58.528863003 +0000 UTC m=+1622.834880316" watchObservedRunningTime="2025-11-22 03:20:58.532374987 +0000 UTC m=+1622.838392260" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.315801 4952 scope.go:117] "RemoveContainer" containerID="13596658045d96daf3a1e90d777e675aa30a919581aecab26a5befffa0fe2382" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.350018 4952 scope.go:117] "RemoveContainer" containerID="0c47826f64a2845bfc4f9be9185c5c3ea688f1ef05f6146a05f9ad4b4f81a521" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 
03:20:59.400824 4952 scope.go:117] "RemoveContainer" containerID="471bc27e97ddcd2067ad8223a8511faf4e2cb12a2f4c76914925268474cce5c9" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.426886 4952 scope.go:117] "RemoveContainer" containerID="03a4dfbceb005964f4300417e04cd67e4788e0f9197bd4f30f38f5db2b23bba9" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.492474 4952 scope.go:117] "RemoveContainer" containerID="7b78a37dc8ff70c046db65123698eb6cbc658df757704874ffe3e57665f7f5cf" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.542936 4952 scope.go:117] "RemoveContainer" containerID="d21104b47037277978c37d63f95e2bc633183b4c2c5ccb18fa643fdedfc48731" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.579153 4952 scope.go:117] "RemoveContainer" containerID="08ef6326853e97e711274c54bed1096fdec5a682022b7ce93b6e08421de32bbf" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.617422 4952 scope.go:117] "RemoveContainer" containerID="6312b4ccf8e25d15976035ae6511f6b0c688ed6da4a54454d97bbcd4f5bf3829" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.655822 4952 scope.go:117] "RemoveContainer" containerID="9b86a7d5b15cea6565ffd2161d277be665d71dcf56084346cf00d31accf620ad" Nov 22 03:20:59 crc kubenswrapper[4952]: I1122 03:20:59.693169 4952 scope.go:117] "RemoveContainer" containerID="7d85312346b2305a83c3a728c3df8638e883d70455e27d38da3360f76d803d0d" Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.064511 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9bjtf"] Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.075742 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-clwjv"] Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.089832 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9bjtf"] Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.104094 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-clwjv"] Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.582268 4952 generic.go:334] "Generic (PLEG): container finished" podID="d1bfe334-0090-44fc-9132-ddd3bbc810b1" containerID="8cdfc3bb0f74a692f48fa4d8c8374acaab432f9ea8dd492c50ef96503e0113e6" exitCode=0 Nov 22 03:21:03 crc kubenswrapper[4952]: I1122 03:21:03.582351 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" event={"ID":"d1bfe334-0090-44fc-9132-ddd3bbc810b1","Type":"ContainerDied","Data":"8cdfc3bb0f74a692f48fa4d8c8374acaab432f9ea8dd492c50ef96503e0113e6"} Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.042031 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-s7fqh"] Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.052190 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s7fqh"] Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.532437 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:21:04 crc kubenswrapper[4952]: E1122 03:21:04.532933 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.551980 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89e1e7cf-32af-45ef-b1bd-36fb741a1ffb" path="/var/lib/kubelet/pods/89e1e7cf-32af-45ef-b1bd-36fb741a1ffb/volumes" Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.553419 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935b4905-14f3-4505-ba9c-225833a9bdb4" path="/var/lib/kubelet/pods/935b4905-14f3-4505-ba9c-225833a9bdb4/volumes" Nov 22 03:21:04 crc kubenswrapper[4952]: I1122 03:21:04.554717 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0628cb-3a91-4838-9eeb-dc8b9087969e" path="/var/lib/kubelet/pods/fe0628cb-3a91-4838-9eeb-dc8b9087969e/volumes" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.106496 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.237280 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory\") pod \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.237493 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjdt\" (UniqueName: \"kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt\") pod \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.238290 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key\") pod \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\" (UID: \"d1bfe334-0090-44fc-9132-ddd3bbc810b1\") " Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.243951 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt" (OuterVolumeSpecName: "kube-api-access-tzjdt") pod "d1bfe334-0090-44fc-9132-ddd3bbc810b1" (UID: "d1bfe334-0090-44fc-9132-ddd3bbc810b1"). InnerVolumeSpecName "kube-api-access-tzjdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.265253 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1bfe334-0090-44fc-9132-ddd3bbc810b1" (UID: "d1bfe334-0090-44fc-9132-ddd3bbc810b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.270230 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory" (OuterVolumeSpecName: "inventory") pod "d1bfe334-0090-44fc-9132-ddd3bbc810b1" (UID: "d1bfe334-0090-44fc-9132-ddd3bbc810b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.340186 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.340224 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjdt\" (UniqueName: \"kubernetes.io/projected/d1bfe334-0090-44fc-9132-ddd3bbc810b1-kube-api-access-tzjdt\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.340240 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1bfe334-0090-44fc-9132-ddd3bbc810b1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.616992 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" event={"ID":"d1bfe334-0090-44fc-9132-ddd3bbc810b1","Type":"ContainerDied","Data":"ff70e3605a0a1ccd5a4c0b3b35c5a9c828546facc4f2dc872703ef5ca023293c"} Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.617040 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff70e3605a0a1ccd5a4c0b3b35c5a9c828546facc4f2dc872703ef5ca023293c" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.617062 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.695338 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6"] Nov 22 03:21:05 crc kubenswrapper[4952]: E1122 03:21:05.695724 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1bfe334-0090-44fc-9132-ddd3bbc810b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.695744 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bfe334-0090-44fc-9132-ddd3bbc810b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.695949 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1bfe334-0090-44fc-9132-ddd3bbc810b1" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.696582 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.699353 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.704019 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.704159 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.704196 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.712562 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6"] Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.750175 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.750248 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v4x4\" (UniqueName: \"kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.750288 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.852355 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.852883 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v4x4\" (UniqueName: \"kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.852924 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" 
(UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.861946 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.866412 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:05 crc kubenswrapper[4952]: I1122 03:21:05.869965 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v4x4\" (UniqueName: \"kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:06 crc kubenswrapper[4952]: I1122 03:21:06.022267 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:21:06 crc kubenswrapper[4952]: I1122 03:21:06.445980 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6"] Nov 22 03:21:06 crc kubenswrapper[4952]: I1122 03:21:06.637278 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" event={"ID":"d0516bf1-c92e-4982-b105-985052d0410a","Type":"ContainerStarted","Data":"98bf9e8e55e57ce6b910bfc7979eeb07a586f988c94e380522e24dc0baf26618"} Nov 22 03:21:07 crc kubenswrapper[4952]: I1122 03:21:07.653010 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" event={"ID":"d0516bf1-c92e-4982-b105-985052d0410a","Type":"ContainerStarted","Data":"4c76599daa9600b66f7c6959563c3afb6e57484199f10a45808022501551bbcf"} Nov 22 03:21:07 crc kubenswrapper[4952]: I1122 03:21:07.677320 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" podStartSLOduration=1.85442354 podStartE2EDuration="2.677299083s" podCreationTimestamp="2025-11-22 03:21:05 +0000 UTC" firstStartedPulling="2025-11-22 03:21:06.45400693 +0000 UTC m=+1630.760024203" lastFinishedPulling="2025-11-22 03:21:07.276882443 +0000 UTC m=+1631.582899746" observedRunningTime="2025-11-22 03:21:07.672489015 +0000 UTC m=+1631.978506318" watchObservedRunningTime="2025-11-22 03:21:07.677299083 +0000 UTC m=+1631.983316356" Nov 22 03:21:17 crc kubenswrapper[4952]: I1122 03:21:17.530661 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:21:17 crc kubenswrapper[4952]: E1122 03:21:17.531970 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:21:20 crc kubenswrapper[4952]: I1122 03:21:20.067830 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8zw6r"] Nov 22 03:21:20 crc kubenswrapper[4952]: I1122 03:21:20.078388 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8zw6r"] Nov 22 03:21:20 crc kubenswrapper[4952]: I1122 03:21:20.547689 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90966d16-8b8d-461f-9bb9-827f0d8cd48b" path="/var/lib/kubelet/pods/90966d16-8b8d-461f-9bb9-827f0d8cd48b/volumes" Nov 22 03:21:22 crc kubenswrapper[4952]: I1122 03:21:22.039416 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-msxp9"] Nov 22 03:21:22 crc kubenswrapper[4952]: I1122 03:21:22.049043 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-msxp9"] Nov 22 03:21:22 crc kubenswrapper[4952]: I1122 03:21:22.560240 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd05df7b-eac0-4a1e-b957-506c8a4c56c4" path="/var/lib/kubelet/pods/dd05df7b-eac0-4a1e-b957-506c8a4c56c4/volumes" Nov 22 03:21:30 crc kubenswrapper[4952]: I1122 03:21:30.531145 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:21:30 crc kubenswrapper[4952]: E1122 03:21:30.532433 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:21:41 crc kubenswrapper[4952]: I1122 03:21:41.531720 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:21:41 crc kubenswrapper[4952]: E1122 03:21:41.532979 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:21:53 crc kubenswrapper[4952]: I1122 03:21:53.532147 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:21:53 crc kubenswrapper[4952]: E1122 03:21:53.533505 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.077180 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cf0c-account-create-24pqf"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.095099 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ac7e-account-create-p4htv"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.104492 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d2e-account-create-kxv6b"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.112691 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lf6jj"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.119286 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8ddxz"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.125891 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ac7e-account-create-p4htv"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.132132 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2d2e-account-create-kxv6b"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.138199 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8qlsn"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.144435 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cf0c-account-create-24pqf"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.152951 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8ddxz"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.158883 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lf6jj"] Nov 22 03:21:55 crc kubenswrapper[4952]: I1122 03:21:55.165358 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8qlsn"] Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.557687 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44db129c-1074-4f16-9c71-31a59ddd62a5" path="/var/lib/kubelet/pods/44db129c-1074-4f16-9c71-31a59ddd62a5/volumes" Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.559256 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f" path="/var/lib/kubelet/pods/73e75a6d-eb3a-4e92-a13c-8ab87d2beb4f/volumes" Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.560587 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74840955-d478-4dfb-a30d-9cff482b4e7e" path="/var/lib/kubelet/pods/74840955-d478-4dfb-a30d-9cff482b4e7e/volumes" Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.562203 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca19638a-64bf-4f46-84d3-efd709c1593f" path="/var/lib/kubelet/pods/ca19638a-64bf-4f46-84d3-efd709c1593f/volumes" Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.565600 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c" path="/var/lib/kubelet/pods/dd5e738e-b9f8-481e-ad1e-2dc51bd81b6c/volumes" Nov 22 03:21:56 crc kubenswrapper[4952]: I1122 03:21:56.566859 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5cc4a94-12fd-42a3-b2cb-7d04163ca285" path="/var/lib/kubelet/pods/f5cc4a94-12fd-42a3-b2cb-7d04163ca285/volumes" Nov 22 03:21:59 crc kubenswrapper[4952]: I1122 03:21:59.896364 4952 scope.go:117] "RemoveContainer" 
containerID="7c0217a1e066fe1a68299984058ad5be4a2e12998d7ae80458072bfc845d07d9" Nov 22 03:21:59 crc kubenswrapper[4952]: I1122 03:21:59.934176 4952 scope.go:117] "RemoveContainer" containerID="bd98db4a7b48722970b5456cc920c7608b8abb3b36fd5e7f18956ea06a0f6561" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.002762 4952 scope.go:117] "RemoveContainer" containerID="14cfdf8ac845bacbb9f2774b3d5b5f835bfcf4f296d65eb2dc4e4b43a11410f1" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.069782 4952 scope.go:117] "RemoveContainer" containerID="17299fdc2a7741cbafd0c25b49e6d0dd900bc3d7a5e74b1a793d5d0e1982d30e" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.109632 4952 scope.go:117] "RemoveContainer" containerID="a3f97f63c0bbb317d9967cdbf48ee6ca2db94efda45a33eb259fa189b6088438" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.156022 4952 scope.go:117] "RemoveContainer" containerID="0dedce8b1945cf3f0fb295895c6b546e1875621fd07d650c41cae1e91903a4e5" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.228756 4952 scope.go:117] "RemoveContainer" containerID="df095af4aaa28c2ce1bb66ab2fe7d4ee4861df62d886e6aa8ea343e5ff18893c" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.287345 4952 scope.go:117] "RemoveContainer" containerID="8c3835e24c1374eeee640debac1908efdf2a4d40b0a7aca1579f293751d88570" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.311717 4952 scope.go:117] "RemoveContainer" containerID="74143ebe0da99ad9e5b9d777f5bf17e6bc2205a641e56283d347e722d3a3aca0" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.341352 4952 scope.go:117] "RemoveContainer" containerID="4407dc190d3c77ff016b7c68de5c18baa7da4a3fa665d1b70b120dd9aefbcc71" Nov 22 03:22:00 crc kubenswrapper[4952]: I1122 03:22:00.366592 4952 scope.go:117] "RemoveContainer" containerID="e64e18e0ef8bb22854ad2fb7150a00e9045015c98fb158ea5bb96431a2d02b05" Nov 22 03:22:05 crc kubenswrapper[4952]: I1122 03:22:05.280619 4952 generic.go:334] "Generic (PLEG): container finished" podID="d0516bf1-c92e-4982-b105-985052d0410a" containerID="4c76599daa9600b66f7c6959563c3afb6e57484199f10a45808022501551bbcf" exitCode=0 Nov 22 03:22:05 crc kubenswrapper[4952]: I1122 03:22:05.280751 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" event={"ID":"d0516bf1-c92e-4982-b105-985052d0410a","Type":"ContainerDied","Data":"4c76599daa9600b66f7c6959563c3afb6e57484199f10a45808022501551bbcf"} Nov 22 03:22:05 crc kubenswrapper[4952]: I1122 03:22:05.531335 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861" Nov 22 03:22:05 crc kubenswrapper[4952]: E1122 03:22:05.531884 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.745212 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.836189 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v4x4\" (UniqueName: \"kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4\") pod \"d0516bf1-c92e-4982-b105-985052d0410a\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.836292 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key\") pod \"d0516bf1-c92e-4982-b105-985052d0410a\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.836636 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory\") pod \"d0516bf1-c92e-4982-b105-985052d0410a\" (UID: \"d0516bf1-c92e-4982-b105-985052d0410a\") " Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.853008 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4" (OuterVolumeSpecName: "kube-api-access-5v4x4") pod "d0516bf1-c92e-4982-b105-985052d0410a" (UID: "d0516bf1-c92e-4982-b105-985052d0410a"). InnerVolumeSpecName "kube-api-access-5v4x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.876865 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0516bf1-c92e-4982-b105-985052d0410a" (UID: "d0516bf1-c92e-4982-b105-985052d0410a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.883267 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory" (OuterVolumeSpecName: "inventory") pod "d0516bf1-c92e-4982-b105-985052d0410a" (UID: "d0516bf1-c92e-4982-b105-985052d0410a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.939605 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v4x4\" (UniqueName: \"kubernetes.io/projected/d0516bf1-c92e-4982-b105-985052d0410a-kube-api-access-5v4x4\") on node \"crc\" DevicePath \"\"" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.939668 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:22:06 crc kubenswrapper[4952]: I1122 03:22:06.939690 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0516bf1-c92e-4982-b105-985052d0410a-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.304931 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" event={"ID":"d0516bf1-c92e-4982-b105-985052d0410a","Type":"ContainerDied","Data":"98bf9e8e55e57ce6b910bfc7979eeb07a586f988c94e380522e24dc0baf26618"} Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.304987 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98bf9e8e55e57ce6b910bfc7979eeb07a586f988c94e380522e24dc0baf26618" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.305095 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.423602 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-blkhx"] Nov 22 03:22:07 crc kubenswrapper[4952]: E1122 03:22:07.424177 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0516bf1-c92e-4982-b105-985052d0410a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.424208 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0516bf1-c92e-4982-b105-985052d0410a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.424500 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0516bf1-c92e-4982-b105-985052d0410a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.425338 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.428901 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.428924 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.428982 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.437329 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.453446 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-blkhx"] Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.552118 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.552197 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.552266 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnbw\" (UniqueName: \"kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.654208 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.655037 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.655100 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnbw\" (UniqueName: \"kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc 
kubenswrapper[4952]: I1122 03:22:07.662198 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.665887 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.674368 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnbw\" (UniqueName: \"kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw\") pod \"ssh-known-hosts-edpm-deployment-blkhx\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:07 crc kubenswrapper[4952]: I1122 03:22:07.752516 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:08 crc kubenswrapper[4952]: I1122 03:22:08.313283 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-blkhx"] Nov 22 03:22:09 crc kubenswrapper[4952]: I1122 03:22:09.332319 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" event={"ID":"cfc29e37-de29-439c-805d-c2f92e6bd117","Type":"ContainerStarted","Data":"99724f15c2d95a86da8115847f17b92deff344e43d6a05437e293e4b600f3e61"} Nov 22 03:22:09 crc kubenswrapper[4952]: I1122 03:22:09.332420 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" event={"ID":"cfc29e37-de29-439c-805d-c2f92e6bd117","Type":"ContainerStarted","Data":"f8d8a741a0a380d1b08d801959a0385c5aaf3b6c15fa8ec5f6fbfae356297c7b"} Nov 22 03:22:09 crc kubenswrapper[4952]: I1122 03:22:09.363111 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" podStartSLOduration=1.894462319 podStartE2EDuration="2.36308863s" podCreationTimestamp="2025-11-22 03:22:07 +0000 UTC" firstStartedPulling="2025-11-22 03:22:08.320330059 +0000 UTC m=+1692.626347372" lastFinishedPulling="2025-11-22 03:22:08.7889564 +0000 UTC m=+1693.094973683" observedRunningTime="2025-11-22 03:22:09.356913336 +0000 UTC m=+1693.662930639" watchObservedRunningTime="2025-11-22 03:22:09.36308863 +0000 UTC m=+1693.669105903" Nov 22 03:22:17 crc kubenswrapper[4952]: I1122 03:22:17.427471 4952 generic.go:334] "Generic (PLEG): container finished" podID="cfc29e37-de29-439c-805d-c2f92e6bd117" containerID="99724f15c2d95a86da8115847f17b92deff344e43d6a05437e293e4b600f3e61" exitCode=0 Nov 22 03:22:17 crc kubenswrapper[4952]: I1122 03:22:17.427674 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" event={"ID":"cfc29e37-de29-439c-805d-c2f92e6bd117","Type":"ContainerDied","Data":"99724f15c2d95a86da8115847f17b92deff344e43d6a05437e293e4b600f3e61"} Nov 22 03:22:18 crc kubenswrapper[4952]: I1122 03:22:18.943023 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.005242 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam\") pod \"cfc29e37-de29-439c-805d-c2f92e6bd117\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.005310 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0\") pod \"cfc29e37-de29-439c-805d-c2f92e6bd117\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.005773 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lnbw\" (UniqueName: \"kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw\") pod \"cfc29e37-de29-439c-805d-c2f92e6bd117\" (UID: \"cfc29e37-de29-439c-805d-c2f92e6bd117\") " Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.012646 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw" (OuterVolumeSpecName: "kube-api-access-7lnbw") pod "cfc29e37-de29-439c-805d-c2f92e6bd117" (UID: "cfc29e37-de29-439c-805d-c2f92e6bd117"). InnerVolumeSpecName "kube-api-access-7lnbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.032760 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cfc29e37-de29-439c-805d-c2f92e6bd117" (UID: "cfc29e37-de29-439c-805d-c2f92e6bd117"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.035663 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cfc29e37-de29-439c-805d-c2f92e6bd117" (UID: "cfc29e37-de29-439c-805d-c2f92e6bd117"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.107994 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lnbw\" (UniqueName: \"kubernetes.io/projected/cfc29e37-de29-439c-805d-c2f92e6bd117-kube-api-access-7lnbw\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.108025 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.108035 4952 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cfc29e37-de29-439c-805d-c2f92e6bd117-inventory-0\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.455748 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx" event={"ID":"cfc29e37-de29-439c-805d-c2f92e6bd117","Type":"ContainerDied","Data":"f8d8a741a0a380d1b08d801959a0385c5aaf3b6c15fa8ec5f6fbfae356297c7b"}
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.455813 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d8a741a0a380d1b08d801959a0385c5aaf3b6c15fa8ec5f6fbfae356297c7b"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.455852 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-blkhx"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.576845 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"]
Nov 22 03:22:19 crc kubenswrapper[4952]: E1122 03:22:19.577288 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc29e37-de29-439c-805d-c2f92e6bd117" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.577328 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc29e37-de29-439c-805d-c2f92e6bd117" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.578009 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc29e37-de29-439c-805d-c2f92e6bd117" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.579103 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.581212 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.583213 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.583435 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.583687 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.586677 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"]
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.719456 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn88h\" (UniqueName: \"kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.720083 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.720134 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.821370 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.821428 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.821483 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn88h\" (UniqueName: \"kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.829970 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.835234 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.849923 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn88h\" (UniqueName: \"kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqqjc\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:19 crc kubenswrapper[4952]: I1122 03:22:19.902369 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:20 crc kubenswrapper[4952]: I1122 03:22:20.251187 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"]
Nov 22 03:22:20 crc kubenswrapper[4952]: I1122 03:22:20.466318 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc" event={"ID":"f1c64327-794d-4559-9b2b-ad7b9ba81dd1","Type":"ContainerStarted","Data":"10e975b7dcff4b49f3c40c5a5a0ae7731d3f0fa452feb6be9f4835f509239e3d"}
Nov 22 03:22:20 crc kubenswrapper[4952]: I1122 03:22:20.532135 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:22:20 crc kubenswrapper[4952]: E1122 03:22:20.532416 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:22:21 crc kubenswrapper[4952]: I1122 03:22:21.482275 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc" event={"ID":"f1c64327-794d-4559-9b2b-ad7b9ba81dd1","Type":"ContainerStarted","Data":"c945fc663caf63f76f7a57f9177dbff81a91573f5f0306765bc1f13560431d77"}
Nov 22 03:22:21 crc kubenswrapper[4952]: I1122 03:22:21.511831 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc" podStartSLOduration=2.082123278 podStartE2EDuration="2.511803103s" podCreationTimestamp="2025-11-22 03:22:19 +0000 UTC" firstStartedPulling="2025-11-22 03:22:20.244701728 +0000 UTC m=+1704.550719001" lastFinishedPulling="2025-11-22 03:22:20.674381513 +0000 UTC m=+1704.980398826" observedRunningTime="2025-11-22 03:22:21.505198049 +0000 UTC m=+1705.811215342" watchObservedRunningTime="2025-11-22 03:22:21.511803103 +0000 UTC m=+1705.817820416"
Nov 22 03:22:28 crc kubenswrapper[4952]: I1122 03:22:28.059267 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rxtxb"]
Nov 22 03:22:28 crc kubenswrapper[4952]: I1122 03:22:28.072435 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rxtxb"]
Nov 22 03:22:28 crc kubenswrapper[4952]: I1122 03:22:28.547412 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff8e549-6708-4b20-acc9-411cf736985a" path="/var/lib/kubelet/pods/dff8e549-6708-4b20-acc9-411cf736985a/volumes"
Nov 22 03:22:30 crc kubenswrapper[4952]: I1122 03:22:30.596376 4952 generic.go:334] "Generic (PLEG): container finished" podID="f1c64327-794d-4559-9b2b-ad7b9ba81dd1" containerID="c945fc663caf63f76f7a57f9177dbff81a91573f5f0306765bc1f13560431d77" exitCode=0
Nov 22 03:22:30 crc kubenswrapper[4952]: I1122 03:22:30.596508 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc" event={"ID":"f1c64327-794d-4559-9b2b-ad7b9ba81dd1","Type":"ContainerDied","Data":"c945fc663caf63f76f7a57f9177dbff81a91573f5f0306765bc1f13560431d77"}
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.095113 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.195524 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory\") pod \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") "
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.195605 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key\") pod \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") "
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.195704 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn88h\" (UniqueName: \"kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h\") pod \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\" (UID: \"f1c64327-794d-4559-9b2b-ad7b9ba81dd1\") "
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.204919 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h" (OuterVolumeSpecName: "kube-api-access-mn88h") pod "f1c64327-794d-4559-9b2b-ad7b9ba81dd1" (UID: "f1c64327-794d-4559-9b2b-ad7b9ba81dd1"). InnerVolumeSpecName "kube-api-access-mn88h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.225502 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1c64327-794d-4559-9b2b-ad7b9ba81dd1" (UID: "f1c64327-794d-4559-9b2b-ad7b9ba81dd1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.231681 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory" (OuterVolumeSpecName: "inventory") pod "f1c64327-794d-4559-9b2b-ad7b9ba81dd1" (UID: "f1c64327-794d-4559-9b2b-ad7b9ba81dd1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.298313 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.298350 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.298365 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn88h\" (UniqueName: \"kubernetes.io/projected/f1c64327-794d-4559-9b2b-ad7b9ba81dd1-kube-api-access-mn88h\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.619508 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc" event={"ID":"f1c64327-794d-4559-9b2b-ad7b9ba81dd1","Type":"ContainerDied","Data":"10e975b7dcff4b49f3c40c5a5a0ae7731d3f0fa452feb6be9f4835f509239e3d"}
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.619585 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e975b7dcff4b49f3c40c5a5a0ae7731d3f0fa452feb6be9f4835f509239e3d"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.619676 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.712463 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"]
Nov 22 03:22:32 crc kubenswrapper[4952]: E1122 03:22:32.713324 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c64327-794d-4559-9b2b-ad7b9ba81dd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.713372 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c64327-794d-4559-9b2b-ad7b9ba81dd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.713823 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c64327-794d-4559-9b2b-ad7b9ba81dd1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.715619 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.718491 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.718890 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.719454 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.726409 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"]
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.728327 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.810532 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.810596 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95pnl\" (UniqueName: \"kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.810683 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.912890 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.912992 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95pnl\" (UniqueName: \"kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.913092 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.919894 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.921307 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:32 crc kubenswrapper[4952]: I1122 03:22:32.950041 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95pnl\" (UniqueName: \"kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:33 crc kubenswrapper[4952]: I1122 03:22:33.038706 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:33 crc kubenswrapper[4952]: I1122 03:22:33.713188 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"]
Nov 22 03:22:34 crc kubenswrapper[4952]: I1122 03:22:34.643341 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh" event={"ID":"a5d7236f-3993-41b9-a1da-bcf22abf958d","Type":"ContainerStarted","Data":"d3cbd193f65de66a86761daa986d358d14e98b316e86f9804533a2745e5e075e"}
Nov 22 03:22:35 crc kubenswrapper[4952]: I1122 03:22:35.531854 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:22:35 crc kubenswrapper[4952]: E1122 03:22:35.532878 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:22:35 crc kubenswrapper[4952]: I1122 03:22:35.662809 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh" event={"ID":"a5d7236f-3993-41b9-a1da-bcf22abf958d","Type":"ContainerStarted","Data":"4ff939dfdbad4b4f2789e1dd4382cec0566e01743e1560320da2a3d75964cfbb"}
Nov 22 03:22:35 crc kubenswrapper[4952]: I1122 03:22:35.698252 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh" podStartSLOduration=2.959560462 podStartE2EDuration="3.69821612s" podCreationTimestamp="2025-11-22 03:22:32 +0000 UTC" firstStartedPulling="2025-11-22 03:22:33.729405516 +0000 UTC m=+1718.035422799" lastFinishedPulling="2025-11-22 03:22:34.468061134 +0000 UTC m=+1718.774078457" observedRunningTime="2025-11-22 03:22:35.686303433 +0000 UTC m=+1719.992320716" watchObservedRunningTime="2025-11-22 03:22:35.69821612 +0000 UTC m=+1720.004233433"
Nov 22 03:22:45 crc kubenswrapper[4952]: I1122 03:22:45.777729 4952 generic.go:334] "Generic (PLEG): container finished" podID="a5d7236f-3993-41b9-a1da-bcf22abf958d" containerID="4ff939dfdbad4b4f2789e1dd4382cec0566e01743e1560320da2a3d75964cfbb" exitCode=0
Nov 22 03:22:45 crc kubenswrapper[4952]: I1122 03:22:45.777857 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh" event={"ID":"a5d7236f-3993-41b9-a1da-bcf22abf958d","Type":"ContainerDied","Data":"4ff939dfdbad4b4f2789e1dd4382cec0566e01743e1560320da2a3d75964cfbb"}
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.040812 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k62bf"]
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.060662 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k62bf"]
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.070657 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ng2bq"]
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.081427 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ng2bq"]
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.545725 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:22:46 crc kubenswrapper[4952]: E1122 03:22:46.546007 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.547766 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7d5803-2a94-4be7-874e-59415a346d19" path="/var/lib/kubelet/pods/8e7d5803-2a94-4be7-874e-59415a346d19/volumes"
Nov 22 03:22:46 crc kubenswrapper[4952]: I1122 03:22:46.549175 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9f0941-6a38-4fc6-bb6c-bdc23d78a279" path="/var/lib/kubelet/pods/ec9f0941-6a38-4fc6-bb6c-bdc23d78a279/volumes"
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.215375 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.321352 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") pod \"a5d7236f-3993-41b9-a1da-bcf22abf958d\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") "
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.321610 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory\") pod \"a5d7236f-3993-41b9-a1da-bcf22abf958d\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") "
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.321811 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95pnl\" (UniqueName: \"kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl\") pod \"a5d7236f-3993-41b9-a1da-bcf22abf958d\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") "
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.802831 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh" event={"ID":"a5d7236f-3993-41b9-a1da-bcf22abf958d","Type":"ContainerDied","Data":"d3cbd193f65de66a86761daa986d358d14e98b316e86f9804533a2745e5e075e"}
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.802876 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3cbd193f65de66a86761daa986d358d14e98b316e86f9804533a2745e5e075e"
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.802887 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.931489 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl" (OuterVolumeSpecName: "kube-api-access-95pnl") pod "a5d7236f-3993-41b9-a1da-bcf22abf958d" (UID: "a5d7236f-3993-41b9-a1da-bcf22abf958d"). InnerVolumeSpecName "kube-api-access-95pnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:22:48 crc kubenswrapper[4952]: E1122 03:22:47.931617 4952 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key podName:a5d7236f-3993-41b9-a1da-bcf22abf958d nodeName:}" failed. No retries permitted until 2025-11-22 03:22:48.431509326 +0000 UTC m=+1732.737526599 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key") pod "a5d7236f-3993-41b9-a1da-bcf22abf958d" (UID: "a5d7236f-3993-41b9-a1da-bcf22abf958d") : error deleting /var/lib/kubelet/pods/a5d7236f-3993-41b9-a1da-bcf22abf958d/volume-subpaths: remove /var/lib/kubelet/pods/a5d7236f-3993-41b9-a1da-bcf22abf958d/volume-subpaths: no such file or directory
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.934242 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95pnl\" (UniqueName: \"kubernetes.io/projected/a5d7236f-3993-41b9-a1da-bcf22abf958d-kube-api-access-95pnl\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:47.942296 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory" (OuterVolumeSpecName: "inventory") pod "a5d7236f-3993-41b9-a1da-bcf22abf958d" (UID: "a5d7236f-3993-41b9-a1da-bcf22abf958d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:48.036019 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:48.444563 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") pod \"a5d7236f-3993-41b9-a1da-bcf22abf958d\" (UID: \"a5d7236f-3993-41b9-a1da-bcf22abf958d\") "
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:48.453785 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5d7236f-3993-41b9-a1da-bcf22abf958d" (UID: "a5d7236f-3993-41b9-a1da-bcf22abf958d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:22:48 crc kubenswrapper[4952]: I1122 03:22:48.546915 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5d7236f-3993-41b9-a1da-bcf22abf958d-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:23:00 crc kubenswrapper[4952]: I1122 03:23:00.531912 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:23:00 crc kubenswrapper[4952]: E1122 03:23:00.533338 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:23:00 crc kubenswrapper[4952]: I1122 03:23:00.616174 4952 scope.go:117] "RemoveContainer" containerID="ae364a3668a54a6d4f786a4c0d76755aadde0432358b62b044d5b0f767ccc98b"
Nov 22 03:23:00 crc kubenswrapper[4952]: I1122 03:23:00.684719 4952 scope.go:117] "RemoveContainer" containerID="b0bc29ddcfabb4faf17c7ef461f2adc42889ee8b2d974dc2d1ea21a1a9a908e4"
Nov 22 03:23:00 crc kubenswrapper[4952]: I1122 03:23:00.806335 4952 scope.go:117] "RemoveContainer" containerID="9c1b04376b7b9ff7f3b32665c342ea62e578e26d4d3df5d6910626f8d4d20806"
Nov 22 03:23:12 crc kubenswrapper[4952]: I1122 03:23:12.531695 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:23:12 crc kubenswrapper[4952]: E1122 03:23:12.532905 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:23:23 crc kubenswrapper[4952]: I1122 03:23:23.531411 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:23:23 crc kubenswrapper[4952]: E1122 03:23:23.532878 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:23:32 crc kubenswrapper[4952]: I1122 03:23:32.059607 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-l4xwk"]
Nov 22 03:23:32 crc kubenswrapper[4952]: I1122 03:23:32.102299 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-l4xwk"]
Nov 22 03:23:32 crc kubenswrapper[4952]: I1122 03:23:32.545242 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04233ec3-e35d-4cb2-959d-2fad451655d2" path="/var/lib/kubelet/pods/04233ec3-e35d-4cb2-959d-2fad451655d2/volumes"
Nov 22 03:23:36 crc kubenswrapper[4952]: I1122 03:23:36.566091 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:23:36 crc kubenswrapper[4952]: E1122 03:23:36.566984 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:23:48 crc kubenswrapper[4952]: I1122 03:23:48.531844 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:23:48 crc kubenswrapper[4952]: E1122 03:23:48.533050 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:24:00 crc kubenswrapper[4952]: I1122 03:24:00.914830 4952 scope.go:117] "RemoveContainer" containerID="7aa8ed29b36a3bae662ab78715137eaf060b736a0f6a1049dc7146ab55e8667a"
Nov 22 03:24:02 crc kubenswrapper[4952]: I1122 03:24:02.531878 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:24:03 crc kubenswrapper[4952]: I1122 03:24:03.660452 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8"}
Nov 22 03:26:28 crc kubenswrapper[4952]: I1122 03:26:28.341672 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:26:28 crc kubenswrapper[4952]: I1122 03:26:28.342489 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:26:58 crc kubenswrapper[4952]: I1122 03:26:58.341781 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:26:58 crc kubenswrapper[4952]: I1122 03:26:58.342834 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:27:28 crc kubenswrapper[4952]: I1122 03:27:28.342136 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:27:28 crc kubenswrapper[4952]: I1122 03:27:28.342991 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:27:28 crc kubenswrapper[4952]: I1122 03:27:28.343045 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:27:28 crc kubenswrapper[4952]: I1122 03:27:28.343903 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:27:28 crc kubenswrapper[4952]: I1122 03:27:28.343968 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8" gracePeriod=600
Nov 22 03:27:29 crc kubenswrapper[4952]: I1122 03:27:29.308334 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8" exitCode=0
Nov 22 03:27:29 crc kubenswrapper[4952]: I1122 03:27:29.308424 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8"}
Nov 22 03:27:29 crc kubenswrapper[4952]: I1122 03:27:29.309450 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"}
Nov 22 03:27:29 crc kubenswrapper[4952]: I1122 03:27:29.309488 4952 scope.go:117] "RemoveContainer" containerID="9d735497af9dc617396adde93f7544220a6f22d0a0f383fa329a5e92f54bc861"
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.704560 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.711490 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.716855 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-blkhx"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.726603 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.731189 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fjx76"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.736087 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.740827 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.745752 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.750719 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-blkhx"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.755728 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.760696 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t7s9f"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.765998 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.771523 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hkzff"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.776412 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4dmc"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.781337 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.786083 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-98ltw"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.791464 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vclrp"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.797579 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-h6fbh"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.802885 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqqjc"]
Nov 22 03:27:38 crc kubenswrapper[4952]: I1122 03:27:38.809146 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8ztc6"]
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.548904 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5a968b-0fa9-4656-bd4f-8d84a160ce8e" path="/var/lib/kubelet/pods/1a5a968b-0fa9-4656-bd4f-8d84a160ce8e/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.549893 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c758bb2-46c5-4fbb-97de-ee24a3648250" path="/var/lib/kubelet/pods/2c758bb2-46c5-4fbb-97de-ee24a3648250/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.550487 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460e80b9-651b-4dc8-a82e-f9a45d9f18ac" path="/var/lib/kubelet/pods/460e80b9-651b-4dc8-a82e-f9a45d9f18ac/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.551021 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562bf029-85bf-47a6-b4a7-913eb130f85b" path="/var/lib/kubelet/pods/562bf029-85bf-47a6-b4a7-913eb130f85b/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.552116 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8715b0-7a1d-465a-9fb6-8024e98f6047" path="/var/lib/kubelet/pods/9c8715b0-7a1d-465a-9fb6-8024e98f6047/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.552629 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d7236f-3993-41b9-a1da-bcf22abf958d" path="/var/lib/kubelet/pods/a5d7236f-3993-41b9-a1da-bcf22abf958d/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.553170 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc29e37-de29-439c-805d-c2f92e6bd117" path="/var/lib/kubelet/pods/cfc29e37-de29-439c-805d-c2f92e6bd117/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.554219 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0516bf1-c92e-4982-b105-985052d0410a" path="/var/lib/kubelet/pods/d0516bf1-c92e-4982-b105-985052d0410a/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.554838 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1bfe334-0090-44fc-9132-ddd3bbc810b1" path="/var/lib/kubelet/pods/d1bfe334-0090-44fc-9132-ddd3bbc810b1/volumes"
Nov 22 03:27:40 crc kubenswrapper[4952]: I1122 03:27:40.555428 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c64327-794d-4559-9b2b-ad7b9ba81dd1" path="/var/lib/kubelet/pods/f1c64327-794d-4559-9b2b-ad7b9ba81dd1/volumes"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.638201 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"]
Nov 22 03:27:44 crc kubenswrapper[4952]: E1122 03:27:44.639140 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d7236f-3993-41b9-a1da-bcf22abf958d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.639154 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d7236f-3993-41b9-a1da-bcf22abf958d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.639318 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d7236f-3993-41b9-a1da-bcf22abf958d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.639958 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.642154 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.642470 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.642517 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.643054 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.645380 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.648190 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"]
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.706666 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.706736 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsx75\" (UniqueName: \"kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.706816 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.706837 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.706880 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.808103 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.808218 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsx75\" (UniqueName: \"kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.808279 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.808304 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.808361 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.815268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.815703 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.818296 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.823504 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.833912 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsx75\" (UniqueName: \"kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:44 crc kubenswrapper[4952]: I1122 03:27:44.995529 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:27:45 crc kubenswrapper[4952]: I1122 03:27:45.629832 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"]
Nov 22 03:27:45 crc kubenswrapper[4952]: I1122 03:27:45.632042 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:27:46 crc kubenswrapper[4952]: I1122 03:27:46.489630 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7" event={"ID":"36289a09-ac23-452c-b88f-9eba30618fe3","Type":"ContainerStarted","Data":"84f017246b3780c1df168d66cb1721c435e16f1a92b9519b4d909fd6bdc5c7c7"}
Nov 22 03:27:46 crc kubenswrapper[4952]: I1122 03:27:46.490359 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7" event={"ID":"36289a09-ac23-452c-b88f-9eba30618fe3","Type":"ContainerStarted","Data":"978ec95b0ba704ebadf3e822c8ce49915f680c9254a36eba232ff94c9f69cb4a"}
Nov 22 03:27:46 crc kubenswrapper[4952]: I1122 03:27:46.510880 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7" podStartSLOduration=1.98593022 podStartE2EDuration="2.51084761s" podCreationTimestamp="2025-11-22 03:27:44 +0000 UTC" firstStartedPulling="2025-11-22 03:27:45.631753881 +0000 UTC m=+2029.937771164" lastFinishedPulling="2025-11-22 03:27:46.156671281 +0000 UTC m=+2030.462688554" observedRunningTime="2025-11-22 03:27:46.50749085 +0000 UTC m=+2030.813508113" watchObservedRunningTime="2025-11-22 03:27:46.51084761 +0000 UTC m=+2030.816864923"
Nov 22 03:27:58 crc kubenswrapper[4952]: I1122 03:27:58.604887 4952 generic.go:334] "Generic (PLEG): container finished" podID="36289a09-ac23-452c-b88f-9eba30618fe3" containerID="84f017246b3780c1df168d66cb1721c435e16f1a92b9519b4d909fd6bdc5c7c7" exitCode=0
Nov 22 03:27:58 crc kubenswrapper[4952]: I1122 03:27:58.605068 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7" event={"ID":"36289a09-ac23-452c-b88f-9eba30618fe3","Type":"ContainerDied","Data":"84f017246b3780c1df168d66cb1721c435e16f1a92b9519b4d909fd6bdc5c7c7"}
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.077428 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.145591 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsx75\" (UniqueName: \"kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75\") pod \"36289a09-ac23-452c-b88f-9eba30618fe3\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") "
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.145711 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph\") pod \"36289a09-ac23-452c-b88f-9eba30618fe3\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") "
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.145755 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key\") pod \"36289a09-ac23-452c-b88f-9eba30618fe3\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") "
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.145815 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory\") pod \"36289a09-ac23-452c-b88f-9eba30618fe3\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") "
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.145941 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle\") pod \"36289a09-ac23-452c-b88f-9eba30618fe3\" (UID: \"36289a09-ac23-452c-b88f-9eba30618fe3\") "
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.153511 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75" (OuterVolumeSpecName: "kube-api-access-dsx75") pod "36289a09-ac23-452c-b88f-9eba30618fe3" (UID: "36289a09-ac23-452c-b88f-9eba30618fe3"). InnerVolumeSpecName "kube-api-access-dsx75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.158383 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "36289a09-ac23-452c-b88f-9eba30618fe3" (UID: "36289a09-ac23-452c-b88f-9eba30618fe3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.158460 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph" (OuterVolumeSpecName: "ceph") pod "36289a09-ac23-452c-b88f-9eba30618fe3" (UID: "36289a09-ac23-452c-b88f-9eba30618fe3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.194953 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36289a09-ac23-452c-b88f-9eba30618fe3" (UID: "36289a09-ac23-452c-b88f-9eba30618fe3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.196736 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory" (OuterVolumeSpecName: "inventory") pod "36289a09-ac23-452c-b88f-9eba30618fe3" (UID: "36289a09-ac23-452c-b88f-9eba30618fe3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.248496 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.249019 4952 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.249047 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsx75\" (UniqueName: \"kubernetes.io/projected/36289a09-ac23-452c-b88f-9eba30618fe3-kube-api-access-dsx75\") on node \"crc\" DevicePath \"\""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.249066 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.249081 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36289a09-ac23-452c-b88f-9eba30618fe3-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.631708 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7" event={"ID":"36289a09-ac23-452c-b88f-9eba30618fe3","Type":"ContainerDied","Data":"978ec95b0ba704ebadf3e822c8ce49915f680c9254a36eba232ff94c9f69cb4a"}
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.631758 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978ec95b0ba704ebadf3e822c8ce49915f680c9254a36eba232ff94c9f69cb4a"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.631983 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.772162 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"]
Nov 22 03:28:00 crc kubenswrapper[4952]: E1122 03:28:00.772536 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36289a09-ac23-452c-b88f-9eba30618fe3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.772567 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="36289a09-ac23-452c-b88f-9eba30618fe3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.772804 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="36289a09-ac23-452c-b88f-9eba30618fe3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.773655 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.775759 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.777073 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.777263 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.777398 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.777509 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.788263 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"]
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.960864 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.960957 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.961308 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.961421 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntsx\" (UniqueName: \"kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:00 crc kubenswrapper[4952]: I1122 03:28:00.961734 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.063510 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.063702 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.063978 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cntsx\" (UniqueName: \"kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.064124 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.064185 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.071229 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"
Nov 22 03:28:01
crc kubenswrapper[4952]: I1122 03:28:01.072787 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.073051 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.079378 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.082764 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntsx\" (UniqueName: \"kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.088721 4952 scope.go:117] "RemoveContainer" containerID="3f53ac2e02db381cd7313d67028bb1a1c324b6969d874dd1b637249d76deae76" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.096991 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.260421 4952 scope.go:117] "RemoveContainer" containerID="4a1a8716a112bd2735614675416540a91718d7bde91565c87bc377779d013126" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.325122 4952 scope.go:117] "RemoveContainer" containerID="fce6a3395361c8740b39d4467840427834d826c7b1fb32c5f5321af038786ec7" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.418932 4952 scope.go:117] "RemoveContainer" containerID="c4524bf3969497a964a75ec9b196ab3741ec07551dad1445d36d5bf2aa73e3b6" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.458375 4952 scope.go:117] "RemoveContainer" containerID="4c76599daa9600b66f7c6959563c3afb6e57484199f10a45808022501551bbcf" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.462215 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq"] Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.502292 4952 scope.go:117] "RemoveContainer" containerID="08481c41dae352c3095509139b1f03e4c6e0e8624a9014dc976c7c698cc0ea65" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.553198 4952 scope.go:117] "RemoveContainer" containerID="8cdfc3bb0f74a692f48fa4d8c8374acaab432f9ea8dd492c50ef96503e0113e6" Nov 22 03:28:01 crc kubenswrapper[4952]: I1122 03:28:01.652783 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" event={"ID":"10f9b191-e7da-494f-b29d-b0594d9044c2","Type":"ContainerStarted","Data":"9606e85d0fabe846028f6b55925beb2093b47195937cba4ed0593bb678433fa1"} Nov 22 03:28:02 crc kubenswrapper[4952]: I1122 03:28:02.667934 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" event={"ID":"10f9b191-e7da-494f-b29d-b0594d9044c2","Type":"ContainerStarted","Data":"743747fdf5005373862c36392afee960aab1d9f89a9433c518c15210f7f29727"} Nov 22 03:28:02 crc kubenswrapper[4952]: I1122 03:28:02.713461 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" podStartSLOduration=2.21443762 podStartE2EDuration="2.713425612s" podCreationTimestamp="2025-11-22 03:28:00 +0000 UTC" firstStartedPulling="2025-11-22 03:28:01.475343806 +0000 UTC m=+2045.781361079" lastFinishedPulling="2025-11-22 03:28:01.974331778 +0000 UTC m=+2046.280349071" observedRunningTime="2025-11-22 03:28:02.69151867 +0000 UTC m=+2046.997535983" watchObservedRunningTime="2025-11-22 03:28:02.713425612 +0000 UTC m=+2047.019442925" Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.813388 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.816360 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.833887 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.906197 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2rm\" (UniqueName: \"kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.906638 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:16 crc kubenswrapper[4952]: I1122 03:28:16.906707 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.008030 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2rm\" (UniqueName: \"kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.008119 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.008170 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.008691 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.008728 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.010804 4952 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.012523 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.025433 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.056201 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2rm\" (UniqueName: \"kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm\") pod \"certified-operators-bhzjn\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.110321 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmxg\" (UniqueName: \"kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.110402 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.110510 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.163249 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.214122 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.214250 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.214324 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmxg\" (UniqueName: \"kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.214805 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.214824 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.235301 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmxg\" (UniqueName: \"kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg\") pod \"community-operators-zfcnd\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.340224 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.748187 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.821339 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerStarted","Data":"43d311e295db2c5a790c3c380042a836194d7ad59a666609e886d40975c49211"} Nov 22 03:28:17 crc kubenswrapper[4952]: I1122 03:28:17.832579 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:17 crc kubenswrapper[4952]: W1122 03:28:17.838271 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd99427_479c_43b6_8019_c0064cd9eadb.slice/crio-f3f62957d2835b78b11005401ea46124f17e02837860b5f5c4f4c2b0594d73ab WatchSource:0}: Error finding container f3f62957d2835b78b11005401ea46124f17e02837860b5f5c4f4c2b0594d73ab: Status 404 returned error can't find the container with id f3f62957d2835b78b11005401ea46124f17e02837860b5f5c4f4c2b0594d73ab Nov 22 03:28:18 crc kubenswrapper[4952]: I1122 03:28:18.839930 4952 generic.go:334] "Generic (PLEG): container finished" podID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerID="629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5" exitCode=0 Nov 22 03:28:18 crc kubenswrapper[4952]: I1122 03:28:18.840029 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerDied","Data":"629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5"} Nov 22 03:28:18 crc kubenswrapper[4952]: I1122 03:28:18.840611 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerStarted","Data":"f3f62957d2835b78b11005401ea46124f17e02837860b5f5c4f4c2b0594d73ab"} Nov 22 03:28:18 crc kubenswrapper[4952]: I1122 03:28:18.845799 4952 generic.go:334] "Generic (PLEG): container finished" podID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerID="8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097" exitCode=0 Nov 22 03:28:18 crc kubenswrapper[4952]: I1122 03:28:18.845881 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerDied","Data":"8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097"} Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.218156 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.224574 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.246663 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.262035 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.262150 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnrc\" (UniqueName: \"kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.262179 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.365205 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.365299 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnrc\" (UniqueName: \"kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.365342 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.365959 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.366228 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.401914 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6lnrc\" (UniqueName: \"kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc\") pod \"redhat-marketplace-zmzds\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:19 crc kubenswrapper[4952]: I1122 03:28:19.598753 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.066352 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.866654 4952 generic.go:334] "Generic (PLEG): container finished" podID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerID="97dfc2298e4da1b346e0ad5668766465d85e40a699d3cc46259c866f4982f108" exitCode=0 Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.866745 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerDied","Data":"97dfc2298e4da1b346e0ad5668766465d85e40a699d3cc46259c866f4982f108"} Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.866983 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerStarted","Data":"ad5e4b18e8f1e45f2c9f0b9f76ed11fc7ad5ea6a55f9a7efaffb9ab358fd17ae"} Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.869231 4952 generic.go:334] "Generic (PLEG): container finished" podID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerID="797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16" exitCode=0 Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.869292 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerDied","Data":"797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16"} Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.872159 4952 generic.go:334] "Generic (PLEG): container finished" podID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerID="3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6" exitCode=0 Nov 22 03:28:20 crc kubenswrapper[4952]: I1122 03:28:20.872186 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerDied","Data":"3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6"} Nov 22 03:28:21 crc kubenswrapper[4952]: I1122 03:28:21.882806 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerStarted","Data":"efc69d38f0c760cfba75452b1b2089eeed18d76f903022ebf8cd162da6f8a65b"} Nov 22 03:28:21 crc kubenswrapper[4952]: I1122 03:28:21.885864 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerStarted","Data":"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be"} Nov 22 03:28:21 crc kubenswrapper[4952]: I1122 03:28:21.888826 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" 
event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerStarted","Data":"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f"} Nov 22 03:28:21 crc kubenswrapper[4952]: I1122 03:28:21.963392 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhzjn" podStartSLOduration=3.558094406 podStartE2EDuration="5.963372986s" podCreationTimestamp="2025-11-22 03:28:16 +0000 UTC" firstStartedPulling="2025-11-22 03:28:18.842796275 +0000 UTC m=+2063.148813588" lastFinishedPulling="2025-11-22 03:28:21.248074855 +0000 UTC m=+2065.554092168" observedRunningTime="2025-11-22 03:28:21.937258154 +0000 UTC m=+2066.243275457" watchObservedRunningTime="2025-11-22 03:28:21.963372986 +0000 UTC m=+2066.269390259" Nov 22 03:28:22 crc kubenswrapper[4952]: I1122 03:28:22.902074 4952 generic.go:334] "Generic (PLEG): container finished" podID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerID="efc69d38f0c760cfba75452b1b2089eeed18d76f903022ebf8cd162da6f8a65b" exitCode=0 Nov 22 03:28:22 crc kubenswrapper[4952]: I1122 03:28:22.902238 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerDied","Data":"efc69d38f0c760cfba75452b1b2089eeed18d76f903022ebf8cd162da6f8a65b"} Nov 22 03:28:22 crc kubenswrapper[4952]: I1122 03:28:22.930884 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zfcnd" podStartSLOduration=4.518505742 podStartE2EDuration="6.93085636s" podCreationTimestamp="2025-11-22 03:28:16 +0000 UTC" firstStartedPulling="2025-11-22 03:28:18.847842098 +0000 UTC m=+2063.153859391" lastFinishedPulling="2025-11-22 03:28:21.260192726 +0000 UTC m=+2065.566210009" observedRunningTime="2025-11-22 03:28:21.96388388 +0000 UTC m=+2066.269901153" watchObservedRunningTime="2025-11-22 03:28:22.93085636 +0000 UTC m=+2067.236873653" Nov 22 03:28:24 crc kubenswrapper[4952]: I1122 03:28:24.928902 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerStarted","Data":"6d09a40b7504001c0a680e83c8457ef8e5a1e61b4b74a3b33eed798d9ad7effc"} Nov 22 03:28:24 crc kubenswrapper[4952]: I1122 03:28:24.960885 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zmzds" podStartSLOduration=2.997067259 podStartE2EDuration="5.960837781s" podCreationTimestamp="2025-11-22 03:28:19 +0000 UTC" firstStartedPulling="2025-11-22 03:28:20.869729423 +0000 UTC m=+2065.175746696" lastFinishedPulling="2025-11-22 03:28:23.833499935 +0000 UTC m=+2068.139517218" observedRunningTime="2025-11-22 03:28:24.953226849 +0000 UTC m=+2069.259244162" watchObservedRunningTime="2025-11-22 03:28:24.960837781 +0000 UTC m=+2069.266855064" Nov 22 03:28:27 crc kubenswrapper[4952]: I1122 03:28:27.163920 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:27 crc kubenswrapper[4952]: I1122 03:28:27.164282 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:27 crc kubenswrapper[4952]: I1122 03:28:27.257977 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:27 crc 
kubenswrapper[4952]: I1122 03:28:27.342191 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:27 crc kubenswrapper[4952]: I1122 03:28:27.342709 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:27 crc kubenswrapper[4952]: I1122 03:28:27.409655 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:28 crc kubenswrapper[4952]: I1122 03:28:28.050411 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:28 crc kubenswrapper[4952]: I1122 03:28:28.074975 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:29 crc kubenswrapper[4952]: I1122 03:28:29.599428 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:29 crc kubenswrapper[4952]: I1122 03:28:29.599860 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:29 crc kubenswrapper[4952]: I1122 03:28:29.608023 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:29 crc kubenswrapper[4952]: I1122 03:28:29.655178 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:29 crc kubenswrapper[4952]: I1122 03:28:29.988875 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bhzjn" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="registry-server" containerID="cri-o://d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be" gracePeriod=2 Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.075594 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.201366 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.201955 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zfcnd" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="registry-server" containerID="cri-o://59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f" gracePeriod=2 Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.557401 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.658007 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.722035 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content\") pod \"1fd99427-479c-43b6-8019-c0064cd9eadb\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.722074 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc2rm\" (UniqueName: \"kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm\") pod \"1fd99427-479c-43b6-8019-c0064cd9eadb\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.722231 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities\") pod \"1fd99427-479c-43b6-8019-c0064cd9eadb\" (UID: \"1fd99427-479c-43b6-8019-c0064cd9eadb\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.727296 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities" (OuterVolumeSpecName: "utilities") pod "1fd99427-479c-43b6-8019-c0064cd9eadb" (UID: "1fd99427-479c-43b6-8019-c0064cd9eadb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.732326 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm" (OuterVolumeSpecName: "kube-api-access-rc2rm") pod "1fd99427-479c-43b6-8019-c0064cd9eadb" (UID: "1fd99427-479c-43b6-8019-c0064cd9eadb"). InnerVolumeSpecName "kube-api-access-rc2rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.766910 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fd99427-479c-43b6-8019-c0064cd9eadb" (UID: "1fd99427-479c-43b6-8019-c0064cd9eadb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.823785 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities\") pod \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.824137 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfmxg\" (UniqueName: \"kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg\") pod \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.824172 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content\") pod \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\" (UID: \"7860fb18-cf1c-4439-a4e4-0d17de2afa76\") " Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.825066 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities" (OuterVolumeSpecName: "utilities") pod "7860fb18-cf1c-4439-a4e4-0d17de2afa76" (UID: "7860fb18-cf1c-4439-a4e4-0d17de2afa76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828058 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg" (OuterVolumeSpecName: "kube-api-access-kfmxg") pod "7860fb18-cf1c-4439-a4e4-0d17de2afa76" (UID: "7860fb18-cf1c-4439-a4e4-0d17de2afa76"). InnerVolumeSpecName "kube-api-access-kfmxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828779 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828799 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc2rm\" (UniqueName: \"kubernetes.io/projected/1fd99427-479c-43b6-8019-c0064cd9eadb-kube-api-access-rc2rm\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828812 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828820 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfmxg\" (UniqueName: \"kubernetes.io/projected/7860fb18-cf1c-4439-a4e4-0d17de2afa76-kube-api-access-kfmxg\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.828828 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd99427-479c-43b6-8019-c0064cd9eadb-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.882336 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7860fb18-cf1c-4439-a4e4-0d17de2afa76" (UID: "7860fb18-cf1c-4439-a4e4-0d17de2afa76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:30 crc kubenswrapper[4952]: I1122 03:28:30.930810 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7860fb18-cf1c-4439-a4e4-0d17de2afa76-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.012907 4952 generic.go:334] "Generic (PLEG): container finished" podID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerID="59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f" exitCode=0 Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.013020 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerDied","Data":"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f"} Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.013036 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfcnd" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.013065 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfcnd" event={"ID":"7860fb18-cf1c-4439-a4e4-0d17de2afa76","Type":"ContainerDied","Data":"43d311e295db2c5a790c3c380042a836194d7ad59a666609e886d40975c49211"} Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.013109 4952 scope.go:117] "RemoveContainer" containerID="59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.019107 4952 generic.go:334] "Generic (PLEG): container finished" podID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerID="d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be" exitCode=0 Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.019186 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhzjn" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.019186 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerDied","Data":"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be"} Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.019253 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhzjn" event={"ID":"1fd99427-479c-43b6-8019-c0064cd9eadb","Type":"ContainerDied","Data":"f3f62957d2835b78b11005401ea46124f17e02837860b5f5c4f4c2b0594d73ab"} Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.045777 4952 scope.go:117] "RemoveContainer" containerID="3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.083033 4952 scope.go:117] "RemoveContainer" containerID="8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.083608 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.083704 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zfcnd"] Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.089915 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.104814 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhzjn"] Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.113420 4952 scope.go:117] "RemoveContainer" containerID="59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.115710 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f\": container with ID starting with 59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f not found: ID does not exist" containerID="59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.115825 4952 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f"} err="failed to get container status \"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f\": rpc error: code = NotFound desc = could not find container \"59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f\": container with ID starting with 59766e007712f6f1a56e45914165012217bd1f0400263d31153ddd121abc8a4f not found: ID does not exist" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.115879 4952 scope.go:117] "RemoveContainer" containerID="3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.118633 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6\": container with ID starting with 3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6 not found: ID does not exist" containerID="3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.118673 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6"} err="failed to get container status \"3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6\": rpc error: code = NotFound desc = could not find container \"3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6\": container with ID starting with 3b34b0ff559fbb1f363e5c8c371fad2a3362820151136b5638e335f2a2a592f6 not found: ID does not exist" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.118704 4952 scope.go:117] "RemoveContainer" containerID="8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.120020 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097\": container with ID starting with 8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097 not found: ID does not exist" containerID="8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.120080 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097"} err="failed to get container status \"8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097\": rpc error: code = NotFound desc = could not find container \"8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097\": container with ID starting with 8a71a8621fa9da8893003a78342972ed9a5bbc9a5f5542161d3f647ce030b097 not found: ID does not exist" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.120114 4952 scope.go:117] "RemoveContainer" containerID="d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.165451 4952 scope.go:117] "RemoveContainer" containerID="797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.188070 4952 scope.go:117] "RemoveContainer" containerID="629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.230245 4952 
scope.go:117] "RemoveContainer" containerID="d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.230882 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be\": container with ID starting with d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be not found: ID does not exist" containerID="d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.230937 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be"} err="failed to get container status \"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be\": rpc error: code = NotFound desc = could not find container \"d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be\": container with ID starting with d39ceee06eb56b95099c4b5dbbae1a64d6b4e82e7ff9c5b3ab369b6b023547be not found: ID does not exist" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.230966 4952 scope.go:117] "RemoveContainer" containerID="797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.231339 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16\": container with ID starting with 797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16 not found: ID does not exist" containerID="797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.231361 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16"} err="failed to get container status \"797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16\": rpc error: code = NotFound desc = could not find container \"797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16\": container with ID starting with 797e34d99df19acc71dfdce4c9f31b3c4de7f8a170e75409143f3153b0cb8b16 not found: ID does not exist" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.231382 4952 scope.go:117] "RemoveContainer" containerID="629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5" Nov 22 03:28:31 crc kubenswrapper[4952]: E1122 03:28:31.231756 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5\": container with ID starting with 629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5 not found: ID does not exist" containerID="629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5" Nov 22 03:28:31 crc kubenswrapper[4952]: I1122 03:28:31.231782 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5"} err="failed to get container status \"629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5\": rpc error: code = NotFound desc = could not find container \"629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5\": container with ID starting with 
629c1014ae88abd7897c3af2e53dc7ee469e64cb2c19e960322b165e4c8305f5 not found: ID does not exist" Nov 22 03:28:32 crc kubenswrapper[4952]: I1122 03:28:32.549904 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" path="/var/lib/kubelet/pods/1fd99427-479c-43b6-8019-c0064cd9eadb/volumes" Nov 22 03:28:32 crc kubenswrapper[4952]: I1122 03:28:32.552228 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" path="/var/lib/kubelet/pods/7860fb18-cf1c-4439-a4e4-0d17de2afa76/volumes" Nov 22 03:28:32 crc kubenswrapper[4952]: I1122 03:28:32.610016 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:32 crc kubenswrapper[4952]: I1122 03:28:32.610465 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zmzds" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="registry-server" containerID="cri-o://6d09a40b7504001c0a680e83c8457ef8e5a1e61b4b74a3b33eed798d9ad7effc" gracePeriod=2 Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.076251 4952 generic.go:334] "Generic (PLEG): container finished" podID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerID="6d09a40b7504001c0a680e83c8457ef8e5a1e61b4b74a3b33eed798d9ad7effc" exitCode=0 Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.076306 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerDied","Data":"6d09a40b7504001c0a680e83c8457ef8e5a1e61b4b74a3b33eed798d9ad7effc"} Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.076682 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmzds" event={"ID":"daa31f84-2af2-42c5-95c6-bb045cb8298d","Type":"ContainerDied","Data":"ad5e4b18e8f1e45f2c9f0b9f76ed11fc7ad5ea6a55f9a7efaffb9ab358fd17ae"} Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.076700 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5e4b18e8f1e45f2c9f0b9f76ed11fc7ad5ea6a55f9a7efaffb9ab358fd17ae" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.145715 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.274810 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lnrc\" (UniqueName: \"kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc\") pod \"daa31f84-2af2-42c5-95c6-bb045cb8298d\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.275067 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content\") pod \"daa31f84-2af2-42c5-95c6-bb045cb8298d\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.275141 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities\") pod \"daa31f84-2af2-42c5-95c6-bb045cb8298d\" (UID: \"daa31f84-2af2-42c5-95c6-bb045cb8298d\") " Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.276304 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities" (OuterVolumeSpecName: "utilities") pod "daa31f84-2af2-42c5-95c6-bb045cb8298d" (UID: "daa31f84-2af2-42c5-95c6-bb045cb8298d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.281659 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc" (OuterVolumeSpecName: "kube-api-access-6lnrc") pod "daa31f84-2af2-42c5-95c6-bb045cb8298d" (UID: "daa31f84-2af2-42c5-95c6-bb045cb8298d"). InnerVolumeSpecName "kube-api-access-6lnrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.314664 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "daa31f84-2af2-42c5-95c6-bb045cb8298d" (UID: "daa31f84-2af2-42c5-95c6-bb045cb8298d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.377584 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lnrc\" (UniqueName: \"kubernetes.io/projected/daa31f84-2af2-42c5-95c6-bb045cb8298d-kube-api-access-6lnrc\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.377648 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:33 crc kubenswrapper[4952]: I1122 03:28:33.377684 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa31f84-2af2-42c5-95c6-bb045cb8298d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:34 crc kubenswrapper[4952]: I1122 03:28:34.087623 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmzds" Nov 22 03:28:34 crc kubenswrapper[4952]: I1122 03:28:34.145497 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:34 crc kubenswrapper[4952]: I1122 03:28:34.158104 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmzds"] Nov 22 03:28:34 crc kubenswrapper[4952]: I1122 03:28:34.552346 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" path="/var/lib/kubelet/pods/daa31f84-2af2-42c5-95c6-bb045cb8298d/volumes" Nov 22 03:29:01 crc kubenswrapper[4952]: I1122 03:29:01.763390 4952 scope.go:117] "RemoveContainer" containerID="4ff939dfdbad4b4f2789e1dd4382cec0566e01743e1560320da2a3d75964cfbb" Nov 22 03:29:01 crc kubenswrapper[4952]: I1122 03:29:01.823048 4952 scope.go:117] "RemoveContainer" containerID="c945fc663caf63f76f7a57f9177dbff81a91573f5f0306765bc1f13560431d77" Nov 22 03:29:01 crc kubenswrapper[4952]: I1122 03:29:01.858786 4952 scope.go:117] "RemoveContainer" containerID="99724f15c2d95a86da8115847f17b92deff344e43d6a05437e293e4b600f3e61" Nov 22 03:29:28 crc kubenswrapper[4952]: I1122 03:29:28.342530 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:29:28 crc kubenswrapper[4952]: I1122 03:29:28.343317 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:29:50 crc kubenswrapper[4952]: I1122 03:29:50.901904 4952 generic.go:334] "Generic (PLEG): container finished" podID="10f9b191-e7da-494f-b29d-b0594d9044c2" containerID="743747fdf5005373862c36392afee960aab1d9f89a9433c518c15210f7f29727" exitCode=0 Nov 22 03:29:50 crc kubenswrapper[4952]: I1122 03:29:50.902053 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" event={"ID":"10f9b191-e7da-494f-b29d-b0594d9044c2","Type":"ContainerDied","Data":"743747fdf5005373862c36392afee960aab1d9f89a9433c518c15210f7f29727"} Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.406539 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.503961 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cntsx\" (UniqueName: \"kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx\") pod \"10f9b191-e7da-494f-b29d-b0594d9044c2\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.504023 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key\") pod \"10f9b191-e7da-494f-b29d-b0594d9044c2\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.504066 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle\") pod \"10f9b191-e7da-494f-b29d-b0594d9044c2\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.504148 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph\") pod \"10f9b191-e7da-494f-b29d-b0594d9044c2\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.504194 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory\") pod \"10f9b191-e7da-494f-b29d-b0594d9044c2\" (UID: \"10f9b191-e7da-494f-b29d-b0594d9044c2\") " Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.511435 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph" (OuterVolumeSpecName: "ceph") pod "10f9b191-e7da-494f-b29d-b0594d9044c2" (UID: "10f9b191-e7da-494f-b29d-b0594d9044c2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.518360 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx" (OuterVolumeSpecName: "kube-api-access-cntsx") pod "10f9b191-e7da-494f-b29d-b0594d9044c2" (UID: "10f9b191-e7da-494f-b29d-b0594d9044c2"). InnerVolumeSpecName "kube-api-access-cntsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.518754 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "10f9b191-e7da-494f-b29d-b0594d9044c2" (UID: "10f9b191-e7da-494f-b29d-b0594d9044c2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.546026 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10f9b191-e7da-494f-b29d-b0594d9044c2" (UID: "10f9b191-e7da-494f-b29d-b0594d9044c2"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.551353 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory" (OuterVolumeSpecName: "inventory") pod "10f9b191-e7da-494f-b29d-b0594d9044c2" (UID: "10f9b191-e7da-494f-b29d-b0594d9044c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.606947 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.606997 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.607020 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cntsx\" (UniqueName: \"kubernetes.io/projected/10f9b191-e7da-494f-b29d-b0594d9044c2-kube-api-access-cntsx\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.607037 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.607054 4952 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f9b191-e7da-494f-b29d-b0594d9044c2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.925521 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" event={"ID":"10f9b191-e7da-494f-b29d-b0594d9044c2","Type":"ContainerDied","Data":"9606e85d0fabe846028f6b55925beb2093b47195937cba4ed0593bb678433fa1"} Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.925592 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq" Nov 22 03:29:52 crc kubenswrapper[4952]: I1122 03:29:52.925610 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9606e85d0fabe846028f6b55925beb2093b47195937cba4ed0593bb678433fa1" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051257 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph"] Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051703 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051727 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051741 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="extract-content" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051751 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="extract-content" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051768 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f9b191-e7da-494f-b29d-b0594d9044c2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051779 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f9b191-e7da-494f-b29d-b0594d9044c2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051788 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="extract-content" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051795 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="extract-content" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051814 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051822 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051849 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051858 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051872 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051880 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051894 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="extract-content" 
Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051903 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="extract-content" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051919 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051927 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: E1122 03:29:53.051938 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.051946 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="extract-utilities" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.052140 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f9b191-e7da-494f-b29d-b0594d9044c2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.052163 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd99427-479c-43b6-8019-c0064cd9eadb" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.052184 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="7860fb18-cf1c-4439-a4e4-0d17de2afa76" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.052196 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa31f84-2af2-42c5-95c6-bb045cb8298d" containerName="registry-server" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.056295 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.059219 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.059765 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.065607 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.066057 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.066466 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.073611 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph"] Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.116334 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.116444 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhn57\" (UniqueName: \"kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.116779 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.116907 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.218588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.218667 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.218773 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.218823 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhn57\" (UniqueName: \"kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.223466 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.223488 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.225398 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.256472 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhn57\" (UniqueName: \"kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g64ph\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.386506 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:29:53 crc kubenswrapper[4952]: I1122 03:29:53.965947 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph"] Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.278373 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.285591 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.291086 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.346598 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.346687 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4t7\" (UniqueName: \"kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.346742 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.448376 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.448524 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.448631 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd4t7\" (UniqueName: \"kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.448925 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") pod \"redhat-operators-nmjsp\" (UID: 
\"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.449181 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.475377 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd4t7\" (UniqueName: \"kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7\") pod \"redhat-operators-nmjsp\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.646057 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.951139 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" event={"ID":"4a56c774-0c51-4378-bfce-81bb5481f736","Type":"ContainerStarted","Data":"ff4febf369e2c1fdba3e99ff10812721bb90244d7abd09f5bebf9c39f82f8bcf"} Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.951622 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" event={"ID":"4a56c774-0c51-4378-bfce-81bb5481f736","Type":"ContainerStarted","Data":"fcd978d9db5b1dd4e26d0f8cf444a05501cab32539e8b30aa365fcf150847fdd"} Nov 22 03:29:54 crc kubenswrapper[4952]: I1122 03:29:54.981191 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" podStartSLOduration=1.496846769 podStartE2EDuration="1.981159197s" podCreationTimestamp="2025-11-22 03:29:53 +0000 UTC" firstStartedPulling="2025-11-22 03:29:53.980591863 +0000 UTC m=+2158.286609136" lastFinishedPulling="2025-11-22 03:29:54.464904281 +0000 UTC m=+2158.770921564" observedRunningTime="2025-11-22 03:29:54.972015554 +0000 UTC m=+2159.278032827" watchObservedRunningTime="2025-11-22 03:29:54.981159197 +0000 UTC m=+2159.287176470" Nov 22 03:29:55 crc kubenswrapper[4952]: I1122 03:29:55.131298 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:29:55 crc kubenswrapper[4952]: W1122 03:29:55.137675 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1943d964_1044_4d88_bf48_51415d4af222.slice/crio-767f85913024915a86dc8d38f027fb194c7a18d79f66e6adc5dd98b826d88eb5 WatchSource:0}: Error finding container 767f85913024915a86dc8d38f027fb194c7a18d79f66e6adc5dd98b826d88eb5: Status 404 returned error can't find the container with id 767f85913024915a86dc8d38f027fb194c7a18d79f66e6adc5dd98b826d88eb5 Nov 22 03:29:55 crc kubenswrapper[4952]: I1122 03:29:55.963978 4952 generic.go:334] "Generic (PLEG): container finished" podID="1943d964-1044-4d88-bf48-51415d4af222" containerID="a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8" exitCode=0 Nov 22 03:29:55 crc kubenswrapper[4952]: I1122 03:29:55.964164 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" 
event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerDied","Data":"a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8"} Nov 22 03:29:55 crc kubenswrapper[4952]: I1122 03:29:55.964802 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerStarted","Data":"767f85913024915a86dc8d38f027fb194c7a18d79f66e6adc5dd98b826d88eb5"} Nov 22 03:29:56 crc kubenswrapper[4952]: I1122 03:29:56.984147 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerStarted","Data":"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5"} Nov 22 03:29:57 crc kubenswrapper[4952]: I1122 03:29:57.997803 4952 generic.go:334] "Generic (PLEG): container finished" podID="1943d964-1044-4d88-bf48-51415d4af222" containerID="d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5" exitCode=0 Nov 22 03:29:57 crc kubenswrapper[4952]: I1122 03:29:57.997859 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerDied","Data":"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5"} Nov 22 03:29:58 crc kubenswrapper[4952]: I1122 03:29:58.342318 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:29:58 crc kubenswrapper[4952]: I1122 03:29:58.342395 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:29:59 crc kubenswrapper[4952]: I1122 03:29:59.009571 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerStarted","Data":"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300"} Nov 22 03:29:59 crc kubenswrapper[4952]: I1122 03:29:59.050609 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nmjsp" podStartSLOduration=2.589263457 podStartE2EDuration="5.050585231s" podCreationTimestamp="2025-11-22 03:29:54 +0000 UTC" firstStartedPulling="2025-11-22 03:29:55.966390014 +0000 UTC m=+2160.272407297" lastFinishedPulling="2025-11-22 03:29:58.427711798 +0000 UTC m=+2162.733729071" observedRunningTime="2025-11-22 03:29:59.035600695 +0000 UTC m=+2163.341617978" watchObservedRunningTime="2025-11-22 03:29:59.050585231 +0000 UTC m=+2163.356602514" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.146056 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k"] Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.148335 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.151171 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.152222 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.163224 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k"] Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.272067 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.272155 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.272181 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj592\" (UniqueName: \"kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.373866 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.374267 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.374588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj592\" (UniqueName: \"kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.375478 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume\") pod 
\"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.385595 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.395167 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj592\" (UniqueName: \"kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592\") pod \"collect-profiles-29396370-pvz2k\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:00 crc kubenswrapper[4952]: I1122 03:30:00.531414 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:01 crc kubenswrapper[4952]: I1122 03:30:01.020161 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k"] Nov 22 03:30:01 crc kubenswrapper[4952]: I1122 03:30:01.051133 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" event={"ID":"bf9f9c23-32be-47e9-85fa-91ed3572291e","Type":"ContainerStarted","Data":"ac43a7b38f0337c763166cbfa2b43a99c42fa73c0139181200a7fbde3a0f0941"} Nov 22 03:30:02 crc kubenswrapper[4952]: I1122 03:30:02.073115 4952 generic.go:334] "Generic (PLEG): container finished" podID="bf9f9c23-32be-47e9-85fa-91ed3572291e" containerID="91f211b25270cea94909e052ea7f67f868614318f5dbbffa3ddf0ec178eec2e6" exitCode=0 Nov 22 03:30:02 crc kubenswrapper[4952]: I1122 03:30:02.073418 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" event={"ID":"bf9f9c23-32be-47e9-85fa-91ed3572291e","Type":"ContainerDied","Data":"91f211b25270cea94909e052ea7f67f868614318f5dbbffa3ddf0ec178eec2e6"} Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.413994 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.543134 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj592\" (UniqueName: \"kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592\") pod \"bf9f9c23-32be-47e9-85fa-91ed3572291e\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.543288 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume\") pod \"bf9f9c23-32be-47e9-85fa-91ed3572291e\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.543330 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume\") pod \"bf9f9c23-32be-47e9-85fa-91ed3572291e\" (UID: \"bf9f9c23-32be-47e9-85fa-91ed3572291e\") " Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.544070 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf9f9c23-32be-47e9-85fa-91ed3572291e" (UID: "bf9f9c23-32be-47e9-85fa-91ed3572291e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.558792 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf9f9c23-32be-47e9-85fa-91ed3572291e" (UID: "bf9f9c23-32be-47e9-85fa-91ed3572291e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.558903 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592" (OuterVolumeSpecName: "kube-api-access-jj592") pod "bf9f9c23-32be-47e9-85fa-91ed3572291e" (UID: "bf9f9c23-32be-47e9-85fa-91ed3572291e"). InnerVolumeSpecName "kube-api-access-jj592". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.646338 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf9f9c23-32be-47e9-85fa-91ed3572291e-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.646391 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf9f9c23-32be-47e9-85fa-91ed3572291e-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:03 crc kubenswrapper[4952]: I1122 03:30:03.646404 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj592\" (UniqueName: \"kubernetes.io/projected/bf9f9c23-32be-47e9-85fa-91ed3572291e-kube-api-access-jj592\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.094466 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" event={"ID":"bf9f9c23-32be-47e9-85fa-91ed3572291e","Type":"ContainerDied","Data":"ac43a7b38f0337c763166cbfa2b43a99c42fa73c0139181200a7fbde3a0f0941"} Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.094777 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac43a7b38f0337c763166cbfa2b43a99c42fa73c0139181200a7fbde3a0f0941" Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.094862 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k" Nov 22 03:30:04 crc kubenswrapper[4952]: E1122 03:30:04.137033 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9f9c23_32be_47e9_85fa_91ed3572291e.slice\": RecentStats: unable to find data in memory cache]" Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.484852 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"] Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.492941 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-swzgq"] Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.889727 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb" path="/var/lib/kubelet/pods/ce9f6cd6-d48b-4b94-b78b-ebc08f07dbdb/volumes" Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.891640 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:04 crc kubenswrapper[4952]: I1122 03:30:04.891684 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:05 crc kubenswrapper[4952]: I1122 03:30:05.710407 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nmjsp" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="registry-server" probeResult="failure" output=< Nov 22 03:30:05 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s Nov 22 03:30:05 crc kubenswrapper[4952]: > Nov 22 03:30:14 crc kubenswrapper[4952]: I1122 03:30:14.692776 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:14 crc kubenswrapper[4952]: I1122 03:30:14.748855 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:14 crc kubenswrapper[4952]: I1122 03:30:14.932956 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.214595 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nmjsp" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="registry-server" containerID="cri-o://e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300" gracePeriod=2 Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.678963 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.706645 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd4t7\" (UniqueName: \"kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7\") pod \"1943d964-1044-4d88-bf48-51415d4af222\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.706763 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities\") pod \"1943d964-1044-4d88-bf48-51415d4af222\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.706847 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") pod \"1943d964-1044-4d88-bf48-51415d4af222\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.707474 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities" (OuterVolumeSpecName: "utilities") pod "1943d964-1044-4d88-bf48-51415d4af222" (UID: "1943d964-1044-4d88-bf48-51415d4af222"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.735047 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7" (OuterVolumeSpecName: "kube-api-access-xd4t7") pod "1943d964-1044-4d88-bf48-51415d4af222" (UID: "1943d964-1044-4d88-bf48-51415d4af222"). InnerVolumeSpecName "kube-api-access-xd4t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.807891 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1943d964-1044-4d88-bf48-51415d4af222" (UID: "1943d964-1044-4d88-bf48-51415d4af222"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.808469 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") pod \"1943d964-1044-4d88-bf48-51415d4af222\" (UID: \"1943d964-1044-4d88-bf48-51415d4af222\") " Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.808967 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd4t7\" (UniqueName: \"kubernetes.io/projected/1943d964-1044-4d88-bf48-51415d4af222-kube-api-access-xd4t7\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.808997 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:16 crc kubenswrapper[4952]: W1122 03:30:16.809090 4952 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1943d964-1044-4d88-bf48-51415d4af222/volumes/kubernetes.io~empty-dir/catalog-content Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.809112 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1943d964-1044-4d88-bf48-51415d4af222" (UID: "1943d964-1044-4d88-bf48-51415d4af222"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:30:16 crc kubenswrapper[4952]: I1122 03:30:16.911380 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1943d964-1044-4d88-bf48-51415d4af222-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.226509 4952 generic.go:334] "Generic (PLEG): container finished" podID="1943d964-1044-4d88-bf48-51415d4af222" containerID="e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300" exitCode=0 Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.226601 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmjsp" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.226594 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerDied","Data":"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300"} Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.226682 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmjsp" event={"ID":"1943d964-1044-4d88-bf48-51415d4af222","Type":"ContainerDied","Data":"767f85913024915a86dc8d38f027fb194c7a18d79f66e6adc5dd98b826d88eb5"} Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.226716 4952 scope.go:117] "RemoveContainer" containerID="e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.252607 4952 scope.go:117] "RemoveContainer" containerID="d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.268590 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.275979 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nmjsp"] Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.285439 4952 scope.go:117] "RemoveContainer" containerID="a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.314412 4952 scope.go:117] "RemoveContainer" containerID="e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300" Nov 22 03:30:17 crc kubenswrapper[4952]: E1122 03:30:17.314901 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300\": container with ID starting with e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300 not found: ID does not exist" containerID="e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.314961 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300"} err="failed to get container status \"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300\": rpc error: code = NotFound desc = could not find container \"e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300\": container with ID starting with e746855ebedfa97fba1a49db0132d20c9c18daaa2e113979019eedbda8f14300 not found: ID does not exist" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.314995 4952 scope.go:117] "RemoveContainer" containerID="d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5" Nov 22 03:30:17 crc kubenswrapper[4952]: E1122 03:30:17.315381 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5\": container with ID starting with d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5 not found: ID does not exist" containerID="d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.315442 4952 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5"} err="failed to get container status \"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5\": rpc error: code = NotFound desc = could not find container \"d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5\": container with ID starting with d5f2c07c8b082521f9734b7b41002dfb64aae7524440635438f9dc92341e00b5 not found: ID does not exist" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.315474 4952 scope.go:117] "RemoveContainer" containerID="a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8" Nov 22 03:30:17 crc kubenswrapper[4952]: E1122 03:30:17.315776 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8\": container with ID starting with a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8 not found: ID does not exist" containerID="a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8" Nov 22 03:30:17 crc kubenswrapper[4952]: I1122 03:30:17.315834 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8"} err="failed to get container status \"a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8\": rpc error: code = NotFound desc = could not find container \"a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8\": container with ID starting with a8b69f0fa9d8ec76dfc8ca868fe1f2b86ba33f6a22977a3370f36aeb38575ba8 not found: ID does not exist" Nov 22 03:30:18 crc kubenswrapper[4952]: I1122 03:30:18.542707 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1943d964-1044-4d88-bf48-51415d4af222" path="/var/lib/kubelet/pods/1943d964-1044-4d88-bf48-51415d4af222/volumes" Nov 22 03:30:21 crc kubenswrapper[4952]: I1122 03:30:21.264946 4952 generic.go:334] "Generic (PLEG): container finished" podID="4a56c774-0c51-4378-bfce-81bb5481f736" containerID="ff4febf369e2c1fdba3e99ff10812721bb90244d7abd09f5bebf9c39f82f8bcf" exitCode=0 Nov 22 03:30:21 crc kubenswrapper[4952]: I1122 03:30:21.265036 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" event={"ID":"4a56c774-0c51-4378-bfce-81bb5481f736","Type":"ContainerDied","Data":"ff4febf369e2c1fdba3e99ff10812721bb90244d7abd09f5bebf9c39f82f8bcf"} Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.666126 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.824017 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph\") pod \"4a56c774-0c51-4378-bfce-81bb5481f736\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.824211 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory\") pod \"4a56c774-0c51-4378-bfce-81bb5481f736\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.824272 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhn57\" (UniqueName: \"kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57\") pod \"4a56c774-0c51-4378-bfce-81bb5481f736\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.824368 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key\") pod \"4a56c774-0c51-4378-bfce-81bb5481f736\" (UID: \"4a56c774-0c51-4378-bfce-81bb5481f736\") " Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.830622 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57" (OuterVolumeSpecName: "kube-api-access-qhn57") pod "4a56c774-0c51-4378-bfce-81bb5481f736" (UID: "4a56c774-0c51-4378-bfce-81bb5481f736"). InnerVolumeSpecName "kube-api-access-qhn57". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.830815 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph" (OuterVolumeSpecName: "ceph") pod "4a56c774-0c51-4378-bfce-81bb5481f736" (UID: "4a56c774-0c51-4378-bfce-81bb5481f736"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.852772 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory" (OuterVolumeSpecName: "inventory") pod "4a56c774-0c51-4378-bfce-81bb5481f736" (UID: "4a56c774-0c51-4378-bfce-81bb5481f736"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.868118 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4a56c774-0c51-4378-bfce-81bb5481f736" (UID: "4a56c774-0c51-4378-bfce-81bb5481f736"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.928182 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.928232 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.928248 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a56c774-0c51-4378-bfce-81bb5481f736-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:22 crc kubenswrapper[4952]: I1122 03:30:22.928290 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhn57\" (UniqueName: \"kubernetes.io/projected/4a56c774-0c51-4378-bfce-81bb5481f736-kube-api-access-qhn57\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.299236 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" event={"ID":"4a56c774-0c51-4378-bfce-81bb5481f736","Type":"ContainerDied","Data":"fcd978d9db5b1dd4e26d0f8cf444a05501cab32539e8b30aa365fcf150847fdd"} Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.299310 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcd978d9db5b1dd4e26d0f8cf444a05501cab32539e8b30aa365fcf150847fdd" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.299532 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g64ph" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.382494 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v"] Nov 22 03:30:23 crc kubenswrapper[4952]: E1122 03:30:23.383001 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="extract-content" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383023 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="extract-content" Nov 22 03:30:23 crc kubenswrapper[4952]: E1122 03:30:23.383034 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9f9c23-32be-47e9-85fa-91ed3572291e" containerName="collect-profiles" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383041 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9f9c23-32be-47e9-85fa-91ed3572291e" containerName="collect-profiles" Nov 22 03:30:23 crc kubenswrapper[4952]: E1122 03:30:23.383069 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a56c774-0c51-4378-bfce-81bb5481f736" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383078 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a56c774-0c51-4378-bfce-81bb5481f736" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:23 crc kubenswrapper[4952]: E1122 03:30:23.383094 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="extract-utilities" Nov 22 03:30:23 crc 
kubenswrapper[4952]: I1122 03:30:23.383102 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="extract-utilities" Nov 22 03:30:23 crc kubenswrapper[4952]: E1122 03:30:23.383124 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="registry-server" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383132 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="registry-server" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383327 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="1943d964-1044-4d88-bf48-51415d4af222" containerName="registry-server" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383341 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a56c774-0c51-4378-bfce-81bb5481f736" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.383356 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9f9c23-32be-47e9-85fa-91ed3572291e" containerName="collect-profiles" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.384028 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.387385 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.387599 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.387711 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.388207 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.389062 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.392119 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v"] Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.537449 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.537798 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9x82\" (UniqueName: \"kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.537848 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.537903 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.639600 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.639681 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.639723 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.639782 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9x82\" (UniqueName: \"kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.651631 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.652833 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.653181 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.661338 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9x82\" (UniqueName: \"kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:23 crc kubenswrapper[4952]: I1122 03:30:23.699880 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:24 crc kubenswrapper[4952]: I1122 03:30:24.414794 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v"] Nov 22 03:30:25 crc kubenswrapper[4952]: I1122 03:30:25.321298 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" event={"ID":"89b44b40-f505-48de-ada1-e476204fd059","Type":"ContainerStarted","Data":"ca81021b1ee170ed5ad570231022a77ac092e5424b483634f5c1cd27178a2948"} Nov 22 03:30:25 crc kubenswrapper[4952]: I1122 03:30:25.321949 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" event={"ID":"89b44b40-f505-48de-ada1-e476204fd059","Type":"ContainerStarted","Data":"902da8aa8fcd6ddbf62b4cf138d214842e05d75569153670f87c06a8405aae59"} Nov 22 03:30:28 crc kubenswrapper[4952]: I1122 03:30:28.341672 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:30:28 crc kubenswrapper[4952]: I1122 03:30:28.341993 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:30:28 crc kubenswrapper[4952]: I1122 03:30:28.342056 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:30:28 crc kubenswrapper[4952]: I1122 03:30:28.343184 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:30:28 crc kubenswrapper[4952]: I1122 03:30:28.343299 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" 
containerID="cri-o://25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" gracePeriod=600 Nov 22 03:30:28 crc kubenswrapper[4952]: E1122 03:30:28.986006 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:30:29 crc kubenswrapper[4952]: I1122 03:30:29.383119 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" exitCode=0 Nov 22 03:30:29 crc kubenswrapper[4952]: I1122 03:30:29.383382 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"} Nov 22 03:30:29 crc kubenswrapper[4952]: I1122 03:30:29.383500 4952 scope.go:117] "RemoveContainer" containerID="91d224567de3e8c5e7b35b6e9663db90e384faa49fc63bd2dfda4142a43c0df8" Nov 22 03:30:29 crc kubenswrapper[4952]: I1122 03:30:29.384393 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:30:29 crc kubenswrapper[4952]: E1122 03:30:29.384950 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:30:29 crc kubenswrapper[4952]: I1122 03:30:29.407174 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" podStartSLOduration=5.968119511 podStartE2EDuration="6.407147159s" podCreationTimestamp="2025-11-22 03:30:23 +0000 UTC" firstStartedPulling="2025-11-22 03:30:24.422771891 +0000 UTC m=+2188.728789154" lastFinishedPulling="2025-11-22 03:30:24.861799509 +0000 UTC m=+2189.167816802" observedRunningTime="2025-11-22 03:30:25.33986158 +0000 UTC m=+2189.645878893" watchObservedRunningTime="2025-11-22 03:30:29.407147159 +0000 UTC m=+2193.713164462" Nov 22 03:30:31 crc kubenswrapper[4952]: I1122 03:30:31.408222 4952 generic.go:334] "Generic (PLEG): container finished" podID="89b44b40-f505-48de-ada1-e476204fd059" containerID="ca81021b1ee170ed5ad570231022a77ac092e5424b483634f5c1cd27178a2948" exitCode=0 Nov 22 03:30:31 crc kubenswrapper[4952]: I1122 03:30:31.408389 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" event={"ID":"89b44b40-f505-48de-ada1-e476204fd059","Type":"ContainerDied","Data":"ca81021b1ee170ed5ad570231022a77ac092e5424b483634f5c1cd27178a2948"} Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.822882 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.870104 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph\") pod \"89b44b40-f505-48de-ada1-e476204fd059\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.870307 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key\") pod \"89b44b40-f505-48de-ada1-e476204fd059\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.871018 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9x82\" (UniqueName: \"kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82\") pod \"89b44b40-f505-48de-ada1-e476204fd059\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.871208 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory\") pod \"89b44b40-f505-48de-ada1-e476204fd059\" (UID: \"89b44b40-f505-48de-ada1-e476204fd059\") " Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.876492 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph" (OuterVolumeSpecName: "ceph") pod "89b44b40-f505-48de-ada1-e476204fd059" (UID: "89b44b40-f505-48de-ada1-e476204fd059"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.888624 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82" (OuterVolumeSpecName: "kube-api-access-d9x82") pod "89b44b40-f505-48de-ada1-e476204fd059" (UID: "89b44b40-f505-48de-ada1-e476204fd059"). InnerVolumeSpecName "kube-api-access-d9x82". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.899446 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "89b44b40-f505-48de-ada1-e476204fd059" (UID: "89b44b40-f505-48de-ada1-e476204fd059"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.900940 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory" (OuterVolumeSpecName: "inventory") pod "89b44b40-f505-48de-ada1-e476204fd059" (UID: "89b44b40-f505-48de-ada1-e476204fd059"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.974348 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.974707 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9x82\" (UniqueName: \"kubernetes.io/projected/89b44b40-f505-48de-ada1-e476204fd059-kube-api-access-d9x82\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.974844 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:32 crc kubenswrapper[4952]: I1122 03:30:32.974964 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89b44b40-f505-48de-ada1-e476204fd059-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.427664 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" event={"ID":"89b44b40-f505-48de-ada1-e476204fd059","Type":"ContainerDied","Data":"902da8aa8fcd6ddbf62b4cf138d214842e05d75569153670f87c06a8405aae59"} Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.427718 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="902da8aa8fcd6ddbf62b4cf138d214842e05d75569153670f87c06a8405aae59" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.427777 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.518656 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm"] Nov 22 03:30:33 crc kubenswrapper[4952]: E1122 03:30:33.519193 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b44b40-f505-48de-ada1-e476204fd059" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.519221 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b44b40-f505-48de-ada1-e476204fd059" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.519498 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b44b40-f505-48de-ada1-e476204fd059" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.520422 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.525726 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.525962 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.526118 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.526259 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.526416 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.534773 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm"] Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.585265 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xqrs\" (UniqueName: \"kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.585335 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.585381 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.585946 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.687641 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.687708 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xqrs\" (UniqueName: 
\"kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.687747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.687801 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.691513 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.691899 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.692804 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.709987 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xqrs\" (UniqueName: \"kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9x9cm\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:33 crc kubenswrapper[4952]: I1122 03:30:33.873835 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:30:34 crc kubenswrapper[4952]: I1122 03:30:34.421622 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm"] Nov 22 03:30:34 crc kubenswrapper[4952]: W1122 03:30:34.429329 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod558d097a_8399_4b38_b883_f28c31b108a3.slice/crio-4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0 WatchSource:0}: Error finding container 4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0: Status 404 returned error can't find the container with id 4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0 Nov 22 03:30:35 crc kubenswrapper[4952]: I1122 03:30:35.448121 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" event={"ID":"558d097a-8399-4b38-b883-f28c31b108a3","Type":"ContainerStarted","Data":"1ef4269b385f97ddfc4e1fcdcb87e6b7e884a1a83be921025d9a2a946017bb0a"} Nov 22 03:30:35 crc kubenswrapper[4952]: I1122 03:30:35.448503 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" event={"ID":"558d097a-8399-4b38-b883-f28c31b108a3","Type":"ContainerStarted","Data":"4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0"} Nov 22 03:30:40 crc kubenswrapper[4952]: I1122 03:30:40.531822 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:30:40 crc kubenswrapper[4952]: E1122 03:30:40.532673 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:30:55 crc kubenswrapper[4952]: I1122 03:30:55.532270 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:30:55 crc kubenswrapper[4952]: E1122 03:30:55.533953 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:31:02 crc kubenswrapper[4952]: I1122 03:31:02.044419 4952 scope.go:117] "RemoveContainer" containerID="80c62495a18d69e9ac1d4e770632cb5979f0f7250b4f5314558c4e4465783306" Nov 22 03:31:06 crc kubenswrapper[4952]: I1122 03:31:06.538971 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:31:06 crc kubenswrapper[4952]: E1122 03:31:06.539824 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:31:17 crc kubenswrapper[4952]: I1122 03:31:17.886657 4952 generic.go:334] "Generic (PLEG): container finished" podID="558d097a-8399-4b38-b883-f28c31b108a3" containerID="1ef4269b385f97ddfc4e1fcdcb87e6b7e884a1a83be921025d9a2a946017bb0a" exitCode=0 Nov 22 03:31:17 crc kubenswrapper[4952]: I1122 03:31:17.886774 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" event={"ID":"558d097a-8399-4b38-b883-f28c31b108a3","Type":"ContainerDied","Data":"1ef4269b385f97ddfc4e1fcdcb87e6b7e884a1a83be921025d9a2a946017bb0a"} Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.365668 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.461436 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xqrs\" (UniqueName: \"kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs\") pod \"558d097a-8399-4b38-b883-f28c31b108a3\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.462707 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph\") pod \"558d097a-8399-4b38-b883-f28c31b108a3\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.462736 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key\") pod \"558d097a-8399-4b38-b883-f28c31b108a3\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.462793 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory\") pod \"558d097a-8399-4b38-b883-f28c31b108a3\" (UID: \"558d097a-8399-4b38-b883-f28c31b108a3\") " Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.470949 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph" (OuterVolumeSpecName: "ceph") pod "558d097a-8399-4b38-b883-f28c31b108a3" (UID: "558d097a-8399-4b38-b883-f28c31b108a3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.471152 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs" (OuterVolumeSpecName: "kube-api-access-2xqrs") pod "558d097a-8399-4b38-b883-f28c31b108a3" (UID: "558d097a-8399-4b38-b883-f28c31b108a3"). InnerVolumeSpecName "kube-api-access-2xqrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.499903 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "558d097a-8399-4b38-b883-f28c31b108a3" (UID: "558d097a-8399-4b38-b883-f28c31b108a3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.505996 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory" (OuterVolumeSpecName: "inventory") pod "558d097a-8399-4b38-b883-f28c31b108a3" (UID: "558d097a-8399-4b38-b883-f28c31b108a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.564019 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.564056 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xqrs\" (UniqueName: \"kubernetes.io/projected/558d097a-8399-4b38-b883-f28c31b108a3-kube-api-access-2xqrs\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.564071 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.564082 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/558d097a-8399-4b38-b883-f28c31b108a3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.910455 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" event={"ID":"558d097a-8399-4b38-b883-f28c31b108a3","Type":"ContainerDied","Data":"4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0"} Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.910503 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4857857c948c6be083acd884364ae9863f8f6ffa344d1cd1be14a064be13e6d0" Nov 22 03:31:19 crc kubenswrapper[4952]: I1122 03:31:19.910516 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9x9cm" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.007648 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl"] Nov 22 03:31:20 crc kubenswrapper[4952]: E1122 03:31:20.007983 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558d097a-8399-4b38-b883-f28c31b108a3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.008002 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="558d097a-8399-4b38-b883-f28c31b108a3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.008223 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="558d097a-8399-4b38-b883-f28c31b108a3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.008818 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.011303 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.011436 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.011478 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.012133 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.013801 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.023687 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl"] Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.072273 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrbb\" (UniqueName: \"kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.072636 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.072747 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.072814 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.175356 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.175506 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrbb\" (UniqueName: \"kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.175622 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.175661 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.189201 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.189900 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.204242 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.212393 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrbb\" (UniqueName: \"kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.347987 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.531870 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:31:20 crc kubenswrapper[4952]: E1122 03:31:20.532058 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.730600 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl"] Nov 22 03:31:20 crc kubenswrapper[4952]: I1122 03:31:20.920936 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" event={"ID":"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4","Type":"ContainerStarted","Data":"762f6bce7ad4fdaffdc328d9712141523b3179b7926cd552e2520903ba73cac9"} Nov 22 03:31:21 crc kubenswrapper[4952]: I1122 03:31:21.933834 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" event={"ID":"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4","Type":"ContainerStarted","Data":"062fed8f0fcf08a3d434f340cba90dcc4f882d5762dd3e174659ce167d4c1d5b"} Nov 22 03:31:21 crc kubenswrapper[4952]: I1122 03:31:21.963312 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" podStartSLOduration=2.244770512 podStartE2EDuration="2.963279823s" podCreationTimestamp="2025-11-22 03:31:19 +0000 UTC" firstStartedPulling="2025-11-22 03:31:20.711951867 +0000 UTC m=+2245.017969130" lastFinishedPulling="2025-11-22 03:31:21.430461148 +0000 UTC m=+2245.736478441" observedRunningTime="2025-11-22 03:31:21.956687258 +0000 UTC m=+2246.262704591" watchObservedRunningTime="2025-11-22 03:31:21.963279823 +0000 UTC m=+2246.269297136" Nov 22 03:31:25 crc kubenswrapper[4952]: I1122 03:31:25.977291 4952 generic.go:334] "Generic (PLEG): container finished" podID="c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" containerID="062fed8f0fcf08a3d434f340cba90dcc4f882d5762dd3e174659ce167d4c1d5b" exitCode=0 Nov 22 03:31:25 crc kubenswrapper[4952]: I1122 03:31:25.977406 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" event={"ID":"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4","Type":"ContainerDied","Data":"062fed8f0fcf08a3d434f340cba90dcc4f882d5762dd3e174659ce167d4c1d5b"} Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.437604 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.529725 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key\") pod \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.530003 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzrbb\" (UniqueName: \"kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb\") pod \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.530026 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph\") pod \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.530050 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory\") pod \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\" (UID: \"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4\") " Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.537254 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph" (OuterVolumeSpecName: "ceph") pod "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" (UID: "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.537458 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb" (OuterVolumeSpecName: "kube-api-access-qzrbb") pod "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" (UID: "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4"). InnerVolumeSpecName "kube-api-access-qzrbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.568792 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" (UID: "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.595701 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory" (OuterVolumeSpecName: "inventory") pod "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" (UID: "c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.632464 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.632506 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzrbb\" (UniqueName: \"kubernetes.io/projected/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-kube-api-access-qzrbb\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.632523 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.632535 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.993304 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" event={"ID":"c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4","Type":"ContainerDied","Data":"762f6bce7ad4fdaffdc328d9712141523b3179b7926cd552e2520903ba73cac9"} Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.993765 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="762f6bce7ad4fdaffdc328d9712141523b3179b7926cd552e2520903ba73cac9" Nov 22 03:31:27 crc kubenswrapper[4952]: I1122 03:31:27.993409 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.085455 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k"] Nov 22 03:31:28 crc kubenswrapper[4952]: E1122 03:31:28.085843 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.085864 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.086045 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.086718 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.090051 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.090242 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.092178 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.092510 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.094952 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.103221 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k"] Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.244282 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.244333 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrggq\" (UniqueName: \"kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.244488 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.244733 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.346622 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.346662 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xrggq\" (UniqueName: \"kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.346730 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.346783 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.350840 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.350927 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.351050 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.368447 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrggq\" (UniqueName: \"kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gz55k\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.416759 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:31:28 crc kubenswrapper[4952]: I1122 03:31:28.983884 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k"] Nov 22 03:31:30 crc kubenswrapper[4952]: I1122 03:31:30.016243 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" event={"ID":"145d85a9-5de9-42d1-b463-d195f016e395","Type":"ContainerStarted","Data":"4181a4d0ac28866f8d1b8fd938462d0e29a983fe127e43c7bd0dba1a727e4821"} Nov 22 03:31:30 crc kubenswrapper[4952]: I1122 03:31:30.016852 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" event={"ID":"145d85a9-5de9-42d1-b463-d195f016e395","Type":"ContainerStarted","Data":"9558d518a313171001560d567324f66630788d0829358f5a511b0d79f9d2a67a"} Nov 22 03:31:30 crc kubenswrapper[4952]: I1122 03:31:30.043603 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" podStartSLOduration=1.607728237 podStartE2EDuration="2.043576149s" podCreationTimestamp="2025-11-22 03:31:28 +0000 UTC" firstStartedPulling="2025-11-22 03:31:28.996474061 +0000 UTC m=+2253.302491374" lastFinishedPulling="2025-11-22 03:31:29.432321993 +0000 UTC m=+2253.738339286" observedRunningTime="2025-11-22 03:31:30.038631628 +0000 UTC m=+2254.344648931" watchObservedRunningTime="2025-11-22 03:31:30.043576149 +0000 UTC m=+2254.349593452" Nov 22 03:31:34 crc kubenswrapper[4952]: I1122 03:31:34.532406 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:31:34 crc kubenswrapper[4952]: E1122 03:31:34.534820 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:31:47 crc kubenswrapper[4952]: I1122 03:31:47.531009 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:31:47 crc kubenswrapper[4952]: E1122 03:31:47.531837 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:31:59 crc kubenswrapper[4952]: I1122 03:31:59.535743 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:31:59 crc kubenswrapper[4952]: E1122 03:31:59.536892 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:32:13 crc kubenswrapper[4952]: I1122 03:32:13.532730 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:32:13 crc kubenswrapper[4952]: E1122 03:32:13.533631 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:32:17 crc kubenswrapper[4952]: I1122 03:32:17.479457 4952 generic.go:334] "Generic (PLEG): container finished" podID="145d85a9-5de9-42d1-b463-d195f016e395" containerID="4181a4d0ac28866f8d1b8fd938462d0e29a983fe127e43c7bd0dba1a727e4821" exitCode=0 Nov 22 03:32:17 crc kubenswrapper[4952]: I1122 03:32:17.479582 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" event={"ID":"145d85a9-5de9-42d1-b463-d195f016e395","Type":"ContainerDied","Data":"4181a4d0ac28866f8d1b8fd938462d0e29a983fe127e43c7bd0dba1a727e4821"} Nov 22 03:32:18 crc kubenswrapper[4952]: I1122 03:32:18.950521 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.051934 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key\") pod \"145d85a9-5de9-42d1-b463-d195f016e395\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.052079 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory\") pod \"145d85a9-5de9-42d1-b463-d195f016e395\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.052112 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph\") pod \"145d85a9-5de9-42d1-b463-d195f016e395\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.052222 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrggq\" (UniqueName: \"kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq\") pod \"145d85a9-5de9-42d1-b463-d195f016e395\" (UID: \"145d85a9-5de9-42d1-b463-d195f016e395\") " Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.060429 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph" (OuterVolumeSpecName: "ceph") pod "145d85a9-5de9-42d1-b463-d195f016e395" (UID: "145d85a9-5de9-42d1-b463-d195f016e395"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.060523 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq" (OuterVolumeSpecName: "kube-api-access-xrggq") pod "145d85a9-5de9-42d1-b463-d195f016e395" (UID: "145d85a9-5de9-42d1-b463-d195f016e395"). InnerVolumeSpecName "kube-api-access-xrggq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.087053 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "145d85a9-5de9-42d1-b463-d195f016e395" (UID: "145d85a9-5de9-42d1-b463-d195f016e395"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.093791 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory" (OuterVolumeSpecName: "inventory") pod "145d85a9-5de9-42d1-b463-d195f016e395" (UID: "145d85a9-5de9-42d1-b463-d195f016e395"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.154502 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.154556 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.154567 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrggq\" (UniqueName: \"kubernetes.io/projected/145d85a9-5de9-42d1-b463-d195f016e395-kube-api-access-xrggq\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.154579 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145d85a9-5de9-42d1-b463-d195f016e395-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.501656 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" event={"ID":"145d85a9-5de9-42d1-b463-d195f016e395","Type":"ContainerDied","Data":"9558d518a313171001560d567324f66630788d0829358f5a511b0d79f9d2a67a"} Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.502014 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9558d518a313171001560d567324f66630788d0829358f5a511b0d79f9d2a67a" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.501771 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gz55k" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.611606 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p9xll"] Nov 22 03:32:19 crc kubenswrapper[4952]: E1122 03:32:19.612000 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145d85a9-5de9-42d1-b463-d195f016e395" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.612021 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="145d85a9-5de9-42d1-b463-d195f016e395" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.612250 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="145d85a9-5de9-42d1-b463-d195f016e395" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.613100 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.615450 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.615701 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.621035 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.621092 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.621034 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.625821 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p9xll"] Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.768304 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9wx\" (UniqueName: \"kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.768376 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.768440 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 
03:32:19.768497 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.870090 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.870199 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9wx\" (UniqueName: \"kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.870246 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.870303 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.875600 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.876031 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.879572 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.898601 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9wx\" (UniqueName: \"kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx\") pod \"ssh-known-hosts-edpm-deployment-p9xll\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:19 crc kubenswrapper[4952]: I1122 03:32:19.936189 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:20 crc kubenswrapper[4952]: I1122 03:32:20.513033 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p9xll"] Nov 22 03:32:21 crc kubenswrapper[4952]: I1122 03:32:21.526087 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" event={"ID":"87882231-d670-44b0-bd0f-9d637b9ddc98","Type":"ContainerStarted","Data":"be25f24e26001b233600943cb8476f322b2a32009ebc0048e75d18cee6e47cb7"} Nov 22 03:32:21 crc kubenswrapper[4952]: I1122 03:32:21.526578 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" event={"ID":"87882231-d670-44b0-bd0f-9d637b9ddc98","Type":"ContainerStarted","Data":"54f4495289046b771254463b3d738fbff93dfb205dbd34dcdbbe989f0f2bbcbc"} Nov 22 03:32:21 crc kubenswrapper[4952]: I1122 03:32:21.546189 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" podStartSLOduration=2.003391765 podStartE2EDuration="2.546173405s" podCreationTimestamp="2025-11-22 03:32:19 +0000 UTC" firstStartedPulling="2025-11-22 03:32:20.516532451 +0000 UTC m=+2304.822549724" lastFinishedPulling="2025-11-22 03:32:21.059314051 +0000 UTC m=+2305.365331364" observedRunningTime="2025-11-22 03:32:21.543943117 +0000 UTC m=+2305.849960390" watchObservedRunningTime="2025-11-22 03:32:21.546173405 +0000 UTC m=+2305.852190678" Nov 22 03:32:25 crc kubenswrapper[4952]: I1122 03:32:25.532155 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:32:25 crc kubenswrapper[4952]: E1122 03:32:25.533159 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:32:31 crc kubenswrapper[4952]: I1122 03:32:31.633839 4952 generic.go:334] "Generic (PLEG): container finished" podID="87882231-d670-44b0-bd0f-9d637b9ddc98" containerID="be25f24e26001b233600943cb8476f322b2a32009ebc0048e75d18cee6e47cb7" exitCode=0 Nov 22 03:32:31 crc kubenswrapper[4952]: I1122 03:32:31.633913 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" event={"ID":"87882231-d670-44b0-bd0f-9d637b9ddc98","Type":"ContainerDied","Data":"be25f24e26001b233600943cb8476f322b2a32009ebc0048e75d18cee6e47cb7"} Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.117164 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.188877 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0\") pod \"87882231-d670-44b0-bd0f-9d637b9ddc98\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.188955 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam\") pod \"87882231-d670-44b0-bd0f-9d637b9ddc98\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.189050 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm9wx\" (UniqueName: \"kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx\") pod \"87882231-d670-44b0-bd0f-9d637b9ddc98\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.189083 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph\") pod \"87882231-d670-44b0-bd0f-9d637b9ddc98\" (UID: \"87882231-d670-44b0-bd0f-9d637b9ddc98\") " Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.196668 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx" (OuterVolumeSpecName: "kube-api-access-lm9wx") pod "87882231-d670-44b0-bd0f-9d637b9ddc98" (UID: "87882231-d670-44b0-bd0f-9d637b9ddc98"). InnerVolumeSpecName "kube-api-access-lm9wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.198370 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph" (OuterVolumeSpecName: "ceph") pod "87882231-d670-44b0-bd0f-9d637b9ddc98" (UID: "87882231-d670-44b0-bd0f-9d637b9ddc98"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.214390 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "87882231-d670-44b0-bd0f-9d637b9ddc98" (UID: "87882231-d670-44b0-bd0f-9d637b9ddc98"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.217794 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "87882231-d670-44b0-bd0f-9d637b9ddc98" (UID: "87882231-d670-44b0-bd0f-9d637b9ddc98"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.291192 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.291235 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm9wx\" (UniqueName: \"kubernetes.io/projected/87882231-d670-44b0-bd0f-9d637b9ddc98-kube-api-access-lm9wx\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.291247 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.291261 4952 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/87882231-d670-44b0-bd0f-9d637b9ddc98-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.658308 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" event={"ID":"87882231-d670-44b0-bd0f-9d637b9ddc98","Type":"ContainerDied","Data":"54f4495289046b771254463b3d738fbff93dfb205dbd34dcdbbe989f0f2bbcbc"} Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.658366 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f4495289046b771254463b3d738fbff93dfb205dbd34dcdbbe989f0f2bbcbc" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.658337 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p9xll" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.751738 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs"] Nov 22 03:32:33 crc kubenswrapper[4952]: E1122 03:32:33.752306 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87882231-d670-44b0-bd0f-9d637b9ddc98" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.752356 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="87882231-d670-44b0-bd0f-9d637b9ddc98" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.752711 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="87882231-d670-44b0-bd0f-9d637b9ddc98" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.753695 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.757368 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.757433 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.762847 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.763023 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.765916 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.772805 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs"] Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.800398 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.800741 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.801000 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrcq7\" (UniqueName: \"kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.801122 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.902588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrcq7\" (UniqueName: \"kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.902882 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.903049 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.903187 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.907188 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.907242 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.907695 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:33 crc kubenswrapper[4952]: I1122 03:32:33.923002 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrcq7\" (UniqueName: \"kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rrshs\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:34 crc kubenswrapper[4952]: I1122 03:32:34.078015 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:34 crc kubenswrapper[4952]: I1122 03:32:34.746426 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs"] Nov 22 03:32:34 crc kubenswrapper[4952]: W1122 03:32:34.752916 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb951af39_2eb8_430d_933b_121f858c322c.slice/crio-8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3 WatchSource:0}: Error finding container 8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3: Status 404 returned error can't find the container with id 8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3 Nov 22 03:32:35 crc kubenswrapper[4952]: I1122 03:32:35.679228 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" event={"ID":"b951af39-2eb8-430d-933b-121f858c322c","Type":"ContainerStarted","Data":"106b365ed964cf5c1662c94f7fb645fff3a0a97dca173a9bd06e033544f69570"} Nov 22 03:32:35 crc kubenswrapper[4952]: I1122 03:32:35.679621 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" event={"ID":"b951af39-2eb8-430d-933b-121f858c322c","Type":"ContainerStarted","Data":"8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3"} Nov 22 03:32:35 crc kubenswrapper[4952]: I1122 03:32:35.705347 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" podStartSLOduration=2.255871869 podStartE2EDuration="2.705322242s" podCreationTimestamp="2025-11-22 03:32:33 +0000 UTC" firstStartedPulling="2025-11-22 03:32:34.755211147 +0000 UTC m=+2319.061228420" lastFinishedPulling="2025-11-22 03:32:35.20466148 +0000 UTC m=+2319.510678793" observedRunningTime="2025-11-22 03:32:35.704044108 +0000 UTC m=+2320.010061381" watchObservedRunningTime="2025-11-22 03:32:35.705322242 +0000 UTC m=+2320.011339535" Nov 22 03:32:39 crc kubenswrapper[4952]: I1122 03:32:39.531160 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:32:39 crc kubenswrapper[4952]: E1122 03:32:39.532169 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:32:43 crc kubenswrapper[4952]: I1122 03:32:43.781427 4952 generic.go:334] "Generic (PLEG): container finished" podID="b951af39-2eb8-430d-933b-121f858c322c" containerID="106b365ed964cf5c1662c94f7fb645fff3a0a97dca173a9bd06e033544f69570" exitCode=0 Nov 22 03:32:43 crc kubenswrapper[4952]: I1122 03:32:43.781615 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" event={"ID":"b951af39-2eb8-430d-933b-121f858c322c","Type":"ContainerDied","Data":"106b365ed964cf5c1662c94f7fb645fff3a0a97dca173a9bd06e033544f69570"} Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.211744 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.344578 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key\") pod \"b951af39-2eb8-430d-933b-121f858c322c\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.344642 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrcq7\" (UniqueName: \"kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7\") pod \"b951af39-2eb8-430d-933b-121f858c322c\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.344749 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph\") pod \"b951af39-2eb8-430d-933b-121f858c322c\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.344859 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory\") pod \"b951af39-2eb8-430d-933b-121f858c322c\" (UID: \"b951af39-2eb8-430d-933b-121f858c322c\") " Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.351140 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph" (OuterVolumeSpecName: "ceph") pod "b951af39-2eb8-430d-933b-121f858c322c" (UID: "b951af39-2eb8-430d-933b-121f858c322c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.351679 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7" (OuterVolumeSpecName: "kube-api-access-nrcq7") pod "b951af39-2eb8-430d-933b-121f858c322c" (UID: "b951af39-2eb8-430d-933b-121f858c322c"). InnerVolumeSpecName "kube-api-access-nrcq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.370917 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory" (OuterVolumeSpecName: "inventory") pod "b951af39-2eb8-430d-933b-121f858c322c" (UID: "b951af39-2eb8-430d-933b-121f858c322c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.398311 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b951af39-2eb8-430d-933b-121f858c322c" (UID: "b951af39-2eb8-430d-933b-121f858c322c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.447430 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.447470 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.447485 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b951af39-2eb8-430d-933b-121f858c322c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.447499 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrcq7\" (UniqueName: \"kubernetes.io/projected/b951af39-2eb8-430d-933b-121f858c322c-kube-api-access-nrcq7\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.803278 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" event={"ID":"b951af39-2eb8-430d-933b-121f858c322c","Type":"ContainerDied","Data":"8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3"} Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.803342 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8983cb0ad785c00580b98b985300bde9684ef869dfc958d64a1a4ef681d9a3c3" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.803387 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rrshs" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.980821 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz"] Nov 22 03:32:45 crc kubenswrapper[4952]: E1122 03:32:45.981364 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b951af39-2eb8-430d-933b-121f858c322c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.981599 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b951af39-2eb8-430d-933b-121f858c322c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.981823 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b951af39-2eb8-430d-933b-121f858c322c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.982702 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.984945 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.985042 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.985114 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.985154 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.990052 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:32:45 crc kubenswrapper[4952]: I1122 03:32:45.999031 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz"] Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.162948 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.163009 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.163084 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpw5g\" (UniqueName: \"kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.163124 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.265371 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpw5g\" (UniqueName: \"kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.265435 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.265558 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.265599 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.270852 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.283279 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.283692 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.287256 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpw5g\" (UniqueName: \"kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.306056 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.650168 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz"] Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.652700 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:32:46 crc kubenswrapper[4952]: I1122 03:32:46.813003 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" event={"ID":"6908216d-a851-4626-a77f-c24f71c10f97","Type":"ContainerStarted","Data":"78ad5031f7b980977e97869a5c703936e515f1e37521b57bb59336afca6a8d52"} Nov 22 03:32:47 crc kubenswrapper[4952]: I1122 03:32:47.823330 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" event={"ID":"6908216d-a851-4626-a77f-c24f71c10f97","Type":"ContainerStarted","Data":"9d5eb8a46fbfa72cb97f79bc4f73e4e0ce2f43d84f4df31d2086bf65bc173fc4"} Nov 22 03:32:47 crc kubenswrapper[4952]: I1122 03:32:47.852605 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" podStartSLOduration=2.43794625 podStartE2EDuration="2.852578799s" podCreationTimestamp="2025-11-22 03:32:45 +0000 UTC" firstStartedPulling="2025-11-22 03:32:46.652473652 +0000 UTC m=+2330.958490925" lastFinishedPulling="2025-11-22 03:32:47.067106171 +0000 UTC m=+2331.373123474" observedRunningTime="2025-11-22 03:32:47.841481484 +0000 UTC m=+2332.147498767" watchObservedRunningTime="2025-11-22 03:32:47.852578799 +0000 UTC m=+2332.158596122" Nov 22 03:32:50 crc kubenswrapper[4952]: I1122 03:32:50.532288 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:32:50 crc kubenswrapper[4952]: E1122 03:32:50.533419 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:32:57 crc kubenswrapper[4952]: I1122 03:32:57.921955 4952 generic.go:334] "Generic (PLEG): container finished" podID="6908216d-a851-4626-a77f-c24f71c10f97" containerID="9d5eb8a46fbfa72cb97f79bc4f73e4e0ce2f43d84f4df31d2086bf65bc173fc4" exitCode=0 Nov 22 03:32:57 crc kubenswrapper[4952]: I1122 03:32:57.922049 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" event={"ID":"6908216d-a851-4626-a77f-c24f71c10f97","Type":"ContainerDied","Data":"9d5eb8a46fbfa72cb97f79bc4f73e4e0ce2f43d84f4df31d2086bf65bc173fc4"} Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.358086 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.438298 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key\") pod \"6908216d-a851-4626-a77f-c24f71c10f97\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.438472 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory\") pod \"6908216d-a851-4626-a77f-c24f71c10f97\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.439154 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph\") pod \"6908216d-a851-4626-a77f-c24f71c10f97\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.439283 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpw5g\" (UniqueName: \"kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g\") pod \"6908216d-a851-4626-a77f-c24f71c10f97\" (UID: \"6908216d-a851-4626-a77f-c24f71c10f97\") " Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.444630 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph" (OuterVolumeSpecName: "ceph") pod "6908216d-a851-4626-a77f-c24f71c10f97" (UID: "6908216d-a851-4626-a77f-c24f71c10f97"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.444829 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g" (OuterVolumeSpecName: "kube-api-access-fpw5g") pod "6908216d-a851-4626-a77f-c24f71c10f97" (UID: "6908216d-a851-4626-a77f-c24f71c10f97"). InnerVolumeSpecName "kube-api-access-fpw5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.471334 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory" (OuterVolumeSpecName: "inventory") pod "6908216d-a851-4626-a77f-c24f71c10f97" (UID: "6908216d-a851-4626-a77f-c24f71c10f97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.491029 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6908216d-a851-4626-a77f-c24f71c10f97" (UID: "6908216d-a851-4626-a77f-c24f71c10f97"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.542074 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.542459 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.542478 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6908216d-a851-4626-a77f-c24f71c10f97-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.542494 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpw5g\" (UniqueName: \"kubernetes.io/projected/6908216d-a851-4626-a77f-c24f71c10f97-kube-api-access-fpw5g\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.950014 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" event={"ID":"6908216d-a851-4626-a77f-c24f71c10f97","Type":"ContainerDied","Data":"78ad5031f7b980977e97869a5c703936e515f1e37521b57bb59336afca6a8d52"} Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.950075 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ad5031f7b980977e97869a5c703936e515f1e37521b57bb59336afca6a8d52" Nov 22 03:32:59 crc kubenswrapper[4952]: I1122 03:32:59.950086 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.090650 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg"] Nov 22 03:33:00 crc kubenswrapper[4952]: E1122 03:33:00.091234 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6908216d-a851-4626-a77f-c24f71c10f97" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.091273 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="6908216d-a851-4626-a77f-c24f71c10f97" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.091611 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="6908216d-a851-4626-a77f-c24f71c10f97" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.092363 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.099444 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg"] Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.120812 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.121316 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.121676 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.121756 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.121967 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.122019 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.122364 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.122422 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261240 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261286 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261433 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261620 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261755 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261788 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261929 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261951 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.261987 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtbb2\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.262007 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.262039 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 
03:33:00.262094 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.262127 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364008 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364089 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364133 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364211 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364272 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364305 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364427 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364487 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtbb2\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364525 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364629 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364700 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.364778 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.371620 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.371785 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.373602 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.375157 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.375268 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.375713 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.377967 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.379428 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.379768 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.380919 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.389812 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.390694 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.398210 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtbb2\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-krzsg\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:00 crc kubenswrapper[4952]: I1122 03:33:00.433735 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:01 crc kubenswrapper[4952]: I1122 03:33:01.035043 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg"] Nov 22 03:33:01 crc kubenswrapper[4952]: I1122 03:33:01.975485 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" event={"ID":"70025cca-7c2a-4798-a3ab-5f58dd05033c","Type":"ContainerStarted","Data":"fc624a31a63f9d36f707e2f26bf7e2f41df86effb68805714031280158921e70"} Nov 22 03:33:01 crc kubenswrapper[4952]: I1122 03:33:01.975920 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" event={"ID":"70025cca-7c2a-4798-a3ab-5f58dd05033c","Type":"ContainerStarted","Data":"d77adb51899b4d343d1e7a14c7f1313e995541cdc58d43058dcc82705e1e9d2a"} Nov 22 03:33:02 crc kubenswrapper[4952]: I1122 03:33:02.002691 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" podStartSLOduration=1.5766340639999998 podStartE2EDuration="2.002669126s" podCreationTimestamp="2025-11-22 03:33:00 +0000 UTC" firstStartedPulling="2025-11-22 03:33:01.040661075 +0000 UTC m=+2345.346678358" lastFinishedPulling="2025-11-22 03:33:01.466696147 +0000 UTC m=+2345.772713420" observedRunningTime="2025-11-22 03:33:01.998841084 +0000 UTC m=+2346.304858357" watchObservedRunningTime="2025-11-22 03:33:02.002669126 +0000 UTC m=+2346.308686399" Nov 22 03:33:03 crc kubenswrapper[4952]: I1122 03:33:03.531762 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:33:03 crc kubenswrapper[4952]: E1122 03:33:03.532629 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:33:18 crc kubenswrapper[4952]: I1122 03:33:18.531175 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:33:18 crc kubenswrapper[4952]: E1122 03:33:18.532117 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:33:30 crc kubenswrapper[4952]: I1122 03:33:30.532484 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:33:30 crc kubenswrapper[4952]: E1122 03:33:30.533598 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:33:37 crc kubenswrapper[4952]: I1122 03:33:37.330632 4952 generic.go:334] "Generic (PLEG): container finished" podID="70025cca-7c2a-4798-a3ab-5f58dd05033c" containerID="fc624a31a63f9d36f707e2f26bf7e2f41df86effb68805714031280158921e70" exitCode=0 Nov 22 03:33:37 crc kubenswrapper[4952]: I1122 03:33:37.330745 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" event={"ID":"70025cca-7c2a-4798-a3ab-5f58dd05033c","Type":"ContainerDied","Data":"fc624a31a63f9d36f707e2f26bf7e2f41df86effb68805714031280158921e70"} Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.733309 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917570 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917619 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917640 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtbb2\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917700 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917721 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917785 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917810 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" 
(UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917844 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917868 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917902 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917951 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.917985 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.918022 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"70025cca-7c2a-4798-a3ab-5f58dd05033c\" (UID: \"70025cca-7c2a-4798-a3ab-5f58dd05033c\") " Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.924410 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.924469 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.925009 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.925728 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.925891 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.926255 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2" (OuterVolumeSpecName: "kube-api-access-mtbb2") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "kube-api-access-mtbb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.926256 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.926743 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.926786 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph" (OuterVolumeSpecName: "ceph") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.928686 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.931206 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.949419 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:38 crc kubenswrapper[4952]: I1122 03:33:38.956113 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory" (OuterVolumeSpecName: "inventory") pod "70025cca-7c2a-4798-a3ab-5f58dd05033c" (UID: "70025cca-7c2a-4798-a3ab-5f58dd05033c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.019687 4952 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.019888 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.019973 4952 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020056 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020128 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020199 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020287 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020370 4952 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020443 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020516 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtbb2\" (UniqueName: \"kubernetes.io/projected/70025cca-7c2a-4798-a3ab-5f58dd05033c-kube-api-access-mtbb2\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020627 4952 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.020713 4952 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc 
kubenswrapper[4952]: I1122 03:33:39.020794 4952 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70025cca-7c2a-4798-a3ab-5f58dd05033c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.349990 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" event={"ID":"70025cca-7c2a-4798-a3ab-5f58dd05033c","Type":"ContainerDied","Data":"d77adb51899b4d343d1e7a14c7f1313e995541cdc58d43058dcc82705e1e9d2a"} Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.350050 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d77adb51899b4d343d1e7a14c7f1313e995541cdc58d43058dcc82705e1e9d2a" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.350084 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-krzsg" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.466934 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"] Nov 22 03:33:39 crc kubenswrapper[4952]: E1122 03:33:39.467534 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70025cca-7c2a-4798-a3ab-5f58dd05033c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.467587 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="70025cca-7c2a-4798-a3ab-5f58dd05033c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.467953 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="70025cca-7c2a-4798-a3ab-5f58dd05033c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.469011 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.473432 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"] Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.477150 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.477676 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.477926 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.478152 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.478688 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.632724 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.633495 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.633604 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.633680 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6sh9\" (UniqueName: \"kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.735468 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.735528 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.735586 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.735625 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6sh9\" (UniqueName: \"kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.748967 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.749305 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.749507 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.754853 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6sh9\" (UniqueName: \"kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" Nov 22 03:33:39 crc kubenswrapper[4952]: I1122 03:33:39.789226 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"
Nov 22 03:33:40 crc kubenswrapper[4952]: I1122 03:33:40.379944 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"]
Nov 22 03:33:41 crc kubenswrapper[4952]: I1122 03:33:41.373737 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" event={"ID":"11e2e39c-3f90-448f-8438-fb38763a3c03","Type":"ContainerStarted","Data":"5477757ec281565120dca1bb43b6c940b6d9e1f5a43049c7ccea8498fef8af29"}
Nov 22 03:33:41 crc kubenswrapper[4952]: I1122 03:33:41.374783 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" event={"ID":"11e2e39c-3f90-448f-8438-fb38763a3c03","Type":"ContainerStarted","Data":"811181f182d14cf4ae32419de58ce9f6e0a09884a4a8004672aeaa1e91ff1342"}
Nov 22 03:33:41 crc kubenswrapper[4952]: I1122 03:33:41.401572 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" podStartSLOduration=1.6925473229999999 podStartE2EDuration="2.4015267s" podCreationTimestamp="2025-11-22 03:33:39 +0000 UTC" firstStartedPulling="2025-11-22 03:33:40.380082664 +0000 UTC m=+2384.686099937" lastFinishedPulling="2025-11-22 03:33:41.089061991 +0000 UTC m=+2385.395079314" observedRunningTime="2025-11-22 03:33:41.397170385 +0000 UTC m=+2385.703187678" watchObservedRunningTime="2025-11-22 03:33:41.4015267 +0000 UTC m=+2385.707543983"
Nov 22 03:33:44 crc kubenswrapper[4952]: I1122 03:33:44.530887 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:33:44 crc kubenswrapper[4952]: E1122 03:33:44.531744 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:33:47 crc kubenswrapper[4952]: I1122 03:33:47.436908 4952 generic.go:334] "Generic (PLEG): container finished" podID="11e2e39c-3f90-448f-8438-fb38763a3c03" containerID="5477757ec281565120dca1bb43b6c940b6d9e1f5a43049c7ccea8498fef8af29" exitCode=0
Nov 22 03:33:47 crc kubenswrapper[4952]: I1122 03:33:47.437918 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" event={"ID":"11e2e39c-3f90-448f-8438-fb38763a3c03","Type":"ContainerDied","Data":"5477757ec281565120dca1bb43b6c940b6d9e1f5a43049c7ccea8498fef8af29"}
Nov 22 03:33:48 crc kubenswrapper[4952]: I1122 03:33:48.870137 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.017741 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6sh9\" (UniqueName: \"kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9\") pod \"11e2e39c-3f90-448f-8438-fb38763a3c03\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") "
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.017888 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory\") pod \"11e2e39c-3f90-448f-8438-fb38763a3c03\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") "
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.017929 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key\") pod \"11e2e39c-3f90-448f-8438-fb38763a3c03\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") "
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.017945 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph\") pod \"11e2e39c-3f90-448f-8438-fb38763a3c03\" (UID: \"11e2e39c-3f90-448f-8438-fb38763a3c03\") "
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.022593 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9" (OuterVolumeSpecName: "kube-api-access-d6sh9") pod "11e2e39c-3f90-448f-8438-fb38763a3c03" (UID: "11e2e39c-3f90-448f-8438-fb38763a3c03"). InnerVolumeSpecName "kube-api-access-d6sh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.029461 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph" (OuterVolumeSpecName: "ceph") pod "11e2e39c-3f90-448f-8438-fb38763a3c03" (UID: "11e2e39c-3f90-448f-8438-fb38763a3c03"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.043045 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory" (OuterVolumeSpecName: "inventory") pod "11e2e39c-3f90-448f-8438-fb38763a3c03" (UID: "11e2e39c-3f90-448f-8438-fb38763a3c03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.045318 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11e2e39c-3f90-448f-8438-fb38763a3c03" (UID: "11e2e39c-3f90-448f-8438-fb38763a3c03"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.119835 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.119865 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.119876 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11e2e39c-3f90-448f-8438-fb38763a3c03-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.119886 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6sh9\" (UniqueName: \"kubernetes.io/projected/11e2e39c-3f90-448f-8438-fb38763a3c03-kube-api-access-d6sh9\") on node \"crc\" DevicePath \"\""
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.462897 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52" event={"ID":"11e2e39c-3f90-448f-8438-fb38763a3c03","Type":"ContainerDied","Data":"811181f182d14cf4ae32419de58ce9f6e0a09884a4a8004672aeaa1e91ff1342"}
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.463427 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="811181f182d14cf4ae32419de58ce9f6e0a09884a4a8004672aeaa1e91ff1342"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.462965 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.569889 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"]
Nov 22 03:33:49 crc kubenswrapper[4952]: E1122 03:33:49.570431 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e2e39c-3f90-448f-8438-fb38763a3c03" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.570449 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e2e39c-3f90-448f-8438-fb38763a3c03" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.570654 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e2e39c-3f90-448f-8438-fb38763a3c03" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.571436 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.575283 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.575642 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.575947 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.575975 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.576096 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.576105 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.586578 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"]
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.732573 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.732620 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.732775 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.732859 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.733033 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqjs\" (UniqueName: \"kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.733127 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834537 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834631 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834669 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqjs\" (UniqueName: \"kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834696 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834862 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.834883 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.836657 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.840483 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.840601 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.843234 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.849134 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.856330 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqjs\" (UniqueName: \"kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fvbt6\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:49 crc kubenswrapper[4952]: I1122 03:33:49.898999 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:33:50 crc kubenswrapper[4952]: I1122 03:33:50.524977 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"]
Nov 22 03:33:51 crc kubenswrapper[4952]: I1122 03:33:51.482308 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6" event={"ID":"99bc1860-8d74-4e04-ba87-35b5254e7a57","Type":"ContainerStarted","Data":"86fc57d7916858a0c95d52eeaeeba7e738c0b684bd193c84391cfe3e36c64315"}
Nov 22 03:33:51 crc kubenswrapper[4952]: I1122 03:33:51.482652 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6" event={"ID":"99bc1860-8d74-4e04-ba87-35b5254e7a57","Type":"ContainerStarted","Data":"95e76610675a17720d98a72593c63f9d58dc41ee6dd37555b5559cc48d0a19fa"}
Nov 22 03:33:51 crc kubenswrapper[4952]: I1122 03:33:51.503534 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6" podStartSLOduration=1.971992172 podStartE2EDuration="2.50350521s" podCreationTimestamp="2025-11-22 03:33:49 +0000 UTC" firstStartedPulling="2025-11-22 03:33:50.533559582 +0000 UTC m=+2394.839576865" lastFinishedPulling="2025-11-22 03:33:51.06507255 +0000 UTC m=+2395.371089903" observedRunningTime="2025-11-22 03:33:51.500198233 +0000 UTC m=+2395.806215506" watchObservedRunningTime="2025-11-22 03:33:51.50350521 +0000 UTC m=+2395.809522493"
Nov 22 03:33:58 crc kubenswrapper[4952]: I1122 03:33:58.531289 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:33:58 crc kubenswrapper[4952]: E1122 03:33:58.532273 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:34:13 crc kubenswrapper[4952]: I1122 03:34:13.531304 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:34:13 crc kubenswrapper[4952]: E1122 03:34:13.532053 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:34:25 crc kubenswrapper[4952]: I1122 03:34:25.531866 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:34:25 crc kubenswrapper[4952]: E1122 03:34:25.533152 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:34:38 crc kubenswrapper[4952]: I1122 03:34:38.532837 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:34:38 crc kubenswrapper[4952]: E1122 03:34:38.534491 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:34:53 crc kubenswrapper[4952]: I1122 03:34:53.531086 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:34:53 crc kubenswrapper[4952]: E1122 03:34:53.531957 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:35:02 crc kubenswrapper[4952]: I1122 03:35:02.220617 4952 scope.go:117] "RemoveContainer" containerID="97dfc2298e4da1b346e0ad5668766465d85e40a699d3cc46259c866f4982f108"
Nov 22 03:35:02 crc kubenswrapper[4952]: I1122 03:35:02.244426 4952 scope.go:117] "RemoveContainer" containerID="6d09a40b7504001c0a680e83c8457ef8e5a1e61b4b74a3b33eed798d9ad7effc"
Nov 22 03:35:02 crc kubenswrapper[4952]: I1122 03:35:02.311808 4952 scope.go:117] "RemoveContainer" containerID="efc69d38f0c760cfba75452b1b2089eeed18d76f903022ebf8cd162da6f8a65b"
Nov 22 03:35:05 crc kubenswrapper[4952]: I1122 03:35:05.531706 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:35:05 crc kubenswrapper[4952]: E1122 03:35:05.532719 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:35:13 crc kubenswrapper[4952]: I1122 03:35:13.326410 4952 generic.go:334] "Generic (PLEG): container finished" podID="99bc1860-8d74-4e04-ba87-35b5254e7a57" containerID="86fc57d7916858a0c95d52eeaeeba7e738c0b684bd193c84391cfe3e36c64315" exitCode=0
Nov 22 03:35:13 crc kubenswrapper[4952]: I1122 03:35:13.326492 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6" event={"ID":"99bc1860-8d74-4e04-ba87-35b5254e7a57","Type":"ContainerDied","Data":"86fc57d7916858a0c95d52eeaeeba7e738c0b684bd193c84391cfe3e36c64315"}
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.762921 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881372 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plqjs\" (UniqueName: \"kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881491 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881594 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881618 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881649 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.881709 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key\") pod \"99bc1860-8d74-4e04-ba87-35b5254e7a57\" (UID: \"99bc1860-8d74-4e04-ba87-35b5254e7a57\") "
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.891574 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.894229 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph" (OuterVolumeSpecName: "ceph") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.894759 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs" (OuterVolumeSpecName: "kube-api-access-plqjs") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "kube-api-access-plqjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.909243 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.911915 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory" (OuterVolumeSpecName: "inventory") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.920130 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "99bc1860-8d74-4e04-ba87-35b5254e7a57" (UID: "99bc1860-8d74-4e04-ba87-35b5254e7a57"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984290 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984330 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984344 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plqjs\" (UniqueName: \"kubernetes.io/projected/99bc1860-8d74-4e04-ba87-35b5254e7a57-kube-api-access-plqjs\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984356 4952 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984371 4952 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/99bc1860-8d74-4e04-ba87-35b5254e7a57-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:14 crc kubenswrapper[4952]: I1122 03:35:14.984383 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99bc1860-8d74-4e04-ba87-35b5254e7a57-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.345891 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6" event={"ID":"99bc1860-8d74-4e04-ba87-35b5254e7a57","Type":"ContainerDied","Data":"95e76610675a17720d98a72593c63f9d58dc41ee6dd37555b5559cc48d0a19fa"}
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.345934 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e76610675a17720d98a72593c63f9d58dc41ee6dd37555b5559cc48d0a19fa"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.345993 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fvbt6"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.482222 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"]
Nov 22 03:35:15 crc kubenswrapper[4952]: E1122 03:35:15.482828 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bc1860-8d74-4e04-ba87-35b5254e7a57" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.482857 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bc1860-8d74-4e04-ba87-35b5254e7a57" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.483124 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bc1860-8d74-4e04-ba87-35b5254e7a57" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.485146 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.489174 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.489463 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.490346 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.490570 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.494995 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.495182 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.495291 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.507015 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"]
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.598561 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.598636 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.598927 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.599151 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhfc\" (UniqueName: \"kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.599192 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.599388 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.599613 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701088 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701168 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701233 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701274 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701310 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701346 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhfc\" (UniqueName: \"kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.701364 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.722646 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.724716 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.724868 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.734143 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.734498 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.747132 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.761337 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhfc\" (UniqueName: \"kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:15 crc kubenswrapper[4952]: I1122 03:35:15.822108 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:35:16 crc kubenswrapper[4952]: I1122 03:35:16.253310 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"]
Nov 22 03:35:16 crc kubenswrapper[4952]: I1122 03:35:16.354533 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5" event={"ID":"fed80aaa-96fc-489e-b8fa-b25d9dff2f51","Type":"ContainerStarted","Data":"1e63c326d3db15b9343f01dd950eda8b3b9edb28043e30fe59589af5eb9d0aac"}
Nov 22 03:35:17 crc kubenswrapper[4952]: I1122 03:35:17.367599 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5" event={"ID":"fed80aaa-96fc-489e-b8fa-b25d9dff2f51","Type":"ContainerStarted","Data":"0904b9b75718710f7adde8d924d6abec842a3ebd197f49cb443aabb8ca6d43ad"}
Nov 22 03:35:17 crc kubenswrapper[4952]: I1122 03:35:17.409227 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5" podStartSLOduration=1.972891895 podStartE2EDuration="2.409198438s" podCreationTimestamp="2025-11-22 03:35:15 +0000 UTC" firstStartedPulling="2025-11-22 03:35:16.256278202 +0000 UTC m=+2480.562295475" lastFinishedPulling="2025-11-22 03:35:16.692584745 +0000 UTC m=+2480.998602018" observedRunningTime="2025-11-22 03:35:17.397419167 +0000 UTC m=+2481.703436450" watchObservedRunningTime="2025-11-22 03:35:17.409198438 +0000 UTC m=+2481.715215751"
Nov 22 03:35:17 crc kubenswrapper[4952]: I1122 03:35:17.531685 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:35:17 crc kubenswrapper[4952]: E1122 03:35:17.532457 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:35:32 crc kubenswrapper[4952]: I1122 03:35:32.531851 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b"
Nov 22 03:35:33 crc kubenswrapper[4952]: I1122 03:35:33.532946 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912"}
Nov 22 03:36:21 crc kubenswrapper[4952]: I1122 03:36:21.005210 4952 generic.go:334] "Generic (PLEG): container finished" podID="fed80aaa-96fc-489e-b8fa-b25d9dff2f51" containerID="0904b9b75718710f7adde8d924d6abec842a3ebd197f49cb443aabb8ca6d43ad" exitCode=0
Nov 22 03:36:21 crc kubenswrapper[4952]: I1122 03:36:21.005271 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5" event={"ID":"fed80aaa-96fc-489e-b8fa-b25d9dff2f51","Type":"ContainerDied","Data":"0904b9b75718710f7adde8d924d6abec842a3ebd197f49cb443aabb8ca6d43ad"}
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.494010 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.622952 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623076 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623185 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nhfc\" (UniqueName: \"kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623522 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623572 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623654 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.623782 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key\") pod \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\" (UID: \"fed80aaa-96fc-489e-b8fa-b25d9dff2f51\") "
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.630224 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.645909 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph" (OuterVolumeSpecName: "ceph") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.646116 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc" (OuterVolumeSpecName: "kube-api-access-6nhfc") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "kube-api-access-6nhfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.653909 4952 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.653950 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nhfc\" (UniqueName: \"kubernetes.io/projected/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-kube-api-access-6nhfc\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.653967 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.657341 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory" (OuterVolumeSpecName: "inventory") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.660274 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.677990 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.678074 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "fed80aaa-96fc-489e-b8fa-b25d9dff2f51" (UID: "fed80aaa-96fc-489e-b8fa-b25d9dff2f51"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.756042 4952 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.756114 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.756131 4952 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:22 crc kubenswrapper[4952]: I1122 03:36:22.756146 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fed80aaa-96fc-489e-b8fa-b25d9dff2f51-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.031328 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5" event={"ID":"fed80aaa-96fc-489e-b8fa-b25d9dff2f51","Type":"ContainerDied","Data":"1e63c326d3db15b9343f01dd950eda8b3b9edb28043e30fe59589af5eb9d0aac"}
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.031389 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e63c326d3db15b9343f01dd950eda8b3b9edb28043e30fe59589af5eb9d0aac"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.031407 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.137845 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"]
Nov 22 03:36:23 crc kubenswrapper[4952]: E1122 03:36:23.138385 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed80aaa-96fc-489e-b8fa-b25d9dff2f51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.138412 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed80aaa-96fc-489e-b8fa-b25d9dff2f51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.138694 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed80aaa-96fc-489e-b8fa-b25d9dff2f51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.139600 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.144991 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.145014 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.145065 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.145209 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.145269 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.145908 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.153725 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"]
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269014 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269279 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269403 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269494 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269620 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.269739 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsvb\" (UniqueName: \"kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.371130 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vsvb\" (UniqueName: \"kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.371471 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.371703 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.372821 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.372859 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.372893 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.376515 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.376859 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.378519 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.379513 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.384068 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.387525 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vsvb\" (UniqueName: \"kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:23 crc kubenswrapper[4952]: I1122 03:36:23.459608 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:36:24 crc kubenswrapper[4952]: I1122 03:36:24.019494 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"]
Nov 22 03:36:24 crc kubenswrapper[4952]: I1122 03:36:24.043961 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c" event={"ID":"4f2a731f-8431-4769-a78e-6522954dd7b5","Type":"ContainerStarted","Data":"b776853b7699a7d16d7bacde9aed438254976007772fa43b4bbe6a5d0b417f6f"}
Nov 22 03:36:25 crc kubenswrapper[4952]: I1122 03:36:25.058469 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c" event={"ID":"4f2a731f-8431-4769-a78e-6522954dd7b5","Type":"ContainerStarted","Data":"66bffba76a6b16a0856259726bbbeb65bfae3dd5bb00910ccd2e57df206c8d5e"}
Nov 22 03:36:25 crc kubenswrapper[4952]: I1122 03:36:25.079703 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c" podStartSLOduration=1.615951653 podStartE2EDuration="2.079682373s" podCreationTimestamp="2025-11-22 03:36:23 +0000 UTC" firstStartedPulling="2025-11-22 03:36:24.031860362 +0000 UTC m=+2548.337877635" lastFinishedPulling="2025-11-22 03:36:24.495591092 +0000 UTC m=+2548.801608355" observedRunningTime="2025-11-22 03:36:25.076830957 +0000 UTC m=+2549.382848240" watchObservedRunningTime="2025-11-22 03:36:25.079682373 +0000 UTC m=+2549.385699646"
Nov 22 03:37:58 crc kubenswrapper[4952]: I1122 03:37:58.341970 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:37:58 crc kubenswrapper[4952]: I1122 03:37:58.342437 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:38:28 crc kubenswrapper[4952]: I1122 03:38:28.342361 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:38:28 crc kubenswrapper[4952]: I1122 03:38:28.343377 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.341525 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.342217 4952 prober.go:107] "Probe failed"
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.342261 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.342963 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.343019 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912" gracePeriod=600 Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.536591 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912" exitCode=0 Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.543920 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912"} Nov 22 03:38:58 crc kubenswrapper[4952]: I1122 03:38:58.543983 4952 scope.go:117] "RemoveContainer" containerID="25ad21a4cc928ab284b129602369cf4d26367bcb59dc9d6460417ce3c5d33b8b" Nov 22 03:38:59 crc kubenswrapper[4952]: I1122 03:38:59.553077 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"} Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.104231 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.108476 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.118488 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.196490 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.196622 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.196871 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6ml\" (UniqueName: \"kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.298642 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.298718 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.298779 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf6ml\" (UniqueName: \"kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.299486 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.299739 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.324277 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xf6ml\" (UniqueName: \"kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml\") pod \"redhat-marketplace-48r6j\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.433648 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:19 crc kubenswrapper[4952]: I1122 03:39:19.972138 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:20 crc kubenswrapper[4952]: I1122 03:39:20.812938 4952 generic.go:334] "Generic (PLEG): container finished" podID="2964d345-d05f-478d-afa8-774d76d40e94" containerID="c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2" exitCode=0 Nov 22 03:39:20 crc kubenswrapper[4952]: I1122 03:39:20.812993 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerDied","Data":"c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2"} Nov 22 03:39:20 crc kubenswrapper[4952]: I1122 03:39:20.813027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerStarted","Data":"7506d039af2a648464215f03be3dec51d02ffcbc2bfedda2293a1b490a54ea58"} Nov 22 03:39:20 crc kubenswrapper[4952]: I1122 03:39:20.815614 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:39:21 crc kubenswrapper[4952]: I1122 03:39:21.826486 4952 generic.go:334] "Generic (PLEG): container finished" podID="2964d345-d05f-478d-afa8-774d76d40e94" containerID="250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573" exitCode=0 Nov 22 03:39:21 crc kubenswrapper[4952]: I1122 03:39:21.826596 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerDied","Data":"250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573"} Nov 22 03:39:22 crc kubenswrapper[4952]: I1122 03:39:22.838868 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerStarted","Data":"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959"} Nov 22 03:39:22 crc kubenswrapper[4952]: I1122 03:39:22.867058 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48r6j" podStartSLOduration=2.441664818 podStartE2EDuration="3.867038085s" podCreationTimestamp="2025-11-22 03:39:19 +0000 UTC" firstStartedPulling="2025-11-22 03:39:20.815220165 +0000 UTC m=+2725.121237448" lastFinishedPulling="2025-11-22 03:39:22.240593432 +0000 UTC m=+2726.546610715" observedRunningTime="2025-11-22 03:39:22.861210031 +0000 UTC m=+2727.167227344" watchObservedRunningTime="2025-11-22 03:39:22.867038085 +0000 UTC m=+2727.173055368" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.474889 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.478139 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.505177 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.633677 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.633737 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47rz\" (UniqueName: \"kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.634126 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.735664 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.735790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.735824 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47rz\" (UniqueName: \"kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.736103 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.736390 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.761341 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f47rz\" (UniqueName: \"kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz\") pod \"certified-operators-gwmp9\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:25 crc kubenswrapper[4952]: I1122 03:39:25.807823 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:26 crc kubenswrapper[4952]: I1122 03:39:26.283857 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:26 crc kubenswrapper[4952]: W1122 03:39:26.290739 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef60d1be_3222_4eba_88f7_d35618db2083.slice/crio-67ad9b33b8bb2f9bd7be4279d02f38e4caae0343bd95fe0be7d7978d95e4e0f4 WatchSource:0}: Error finding container 67ad9b33b8bb2f9bd7be4279d02f38e4caae0343bd95fe0be7d7978d95e4e0f4: Status 404 returned error can't find the container with id 67ad9b33b8bb2f9bd7be4279d02f38e4caae0343bd95fe0be7d7978d95e4e0f4 Nov 22 03:39:26 crc kubenswrapper[4952]: I1122 03:39:26.881271 4952 generic.go:334] "Generic (PLEG): container finished" podID="ef60d1be-3222-4eba-88f7-d35618db2083" containerID="987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164" exitCode=0 Nov 22 03:39:26 crc kubenswrapper[4952]: I1122 03:39:26.881329 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerDied","Data":"987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164"} Nov 22 03:39:26 crc kubenswrapper[4952]: I1122 03:39:26.883701 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerStarted","Data":"67ad9b33b8bb2f9bd7be4279d02f38e4caae0343bd95fe0be7d7978d95e4e0f4"} Nov 22 03:39:27 crc kubenswrapper[4952]: I1122 03:39:27.905261 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerStarted","Data":"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac"} Nov 22 03:39:28 crc kubenswrapper[4952]: E1122 03:39:28.083402 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef60d1be_3222_4eba_88f7_d35618db2083.slice/crio-80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef60d1be_3222_4eba_88f7_d35618db2083.slice/crio-conmon-80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:39:28 crc kubenswrapper[4952]: I1122 03:39:28.917197 4952 generic.go:334] "Generic (PLEG): container finished" podID="ef60d1be-3222-4eba-88f7-d35618db2083" containerID="80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac" exitCode=0 Nov 22 03:39:28 crc kubenswrapper[4952]: I1122 03:39:28.917260 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" 
event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerDied","Data":"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac"} Nov 22 03:39:28 crc kubenswrapper[4952]: I1122 03:39:28.917591 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerStarted","Data":"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44"} Nov 22 03:39:28 crc kubenswrapper[4952]: I1122 03:39:28.945178 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwmp9" podStartSLOduration=2.494082861 podStartE2EDuration="3.945161271s" podCreationTimestamp="2025-11-22 03:39:25 +0000 UTC" firstStartedPulling="2025-11-22 03:39:26.885720857 +0000 UTC m=+2731.191738140" lastFinishedPulling="2025-11-22 03:39:28.336799257 +0000 UTC m=+2732.642816550" observedRunningTime="2025-11-22 03:39:28.942628893 +0000 UTC m=+2733.248646166" watchObservedRunningTime="2025-11-22 03:39:28.945161271 +0000 UTC m=+2733.251178534" Nov 22 03:39:29 crc kubenswrapper[4952]: I1122 03:39:29.434708 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:29 crc kubenswrapper[4952]: I1122 03:39:29.435383 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:29 crc kubenswrapper[4952]: I1122 03:39:29.510663 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:29 crc kubenswrapper[4952]: I1122 03:39:29.998170 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:31 crc kubenswrapper[4952]: I1122 03:39:31.669997 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:32 crc kubenswrapper[4952]: I1122 03:39:32.962779 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48r6j" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="registry-server" containerID="cri-o://05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959" gracePeriod=2 Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.574213 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.736793 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf6ml\" (UniqueName: \"kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml\") pod \"2964d345-d05f-478d-afa8-774d76d40e94\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.736977 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities\") pod \"2964d345-d05f-478d-afa8-774d76d40e94\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.737016 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content\") pod \"2964d345-d05f-478d-afa8-774d76d40e94\" (UID: \"2964d345-d05f-478d-afa8-774d76d40e94\") " Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.738326 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities" (OuterVolumeSpecName: "utilities") pod "2964d345-d05f-478d-afa8-774d76d40e94" (UID: "2964d345-d05f-478d-afa8-774d76d40e94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.747178 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml" (OuterVolumeSpecName: "kube-api-access-xf6ml") pod "2964d345-d05f-478d-afa8-774d76d40e94" (UID: "2964d345-d05f-478d-afa8-774d76d40e94"). InnerVolumeSpecName "kube-api-access-xf6ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.764868 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2964d345-d05f-478d-afa8-774d76d40e94" (UID: "2964d345-d05f-478d-afa8-774d76d40e94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.839833 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.839943 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2964d345-d05f-478d-afa8-774d76d40e94-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.839970 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf6ml\" (UniqueName: \"kubernetes.io/projected/2964d345-d05f-478d-afa8-774d76d40e94-kube-api-access-xf6ml\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.973123 4952 generic.go:334] "Generic (PLEG): container finished" podID="2964d345-d05f-478d-afa8-774d76d40e94" containerID="05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959" exitCode=0 Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.973159 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerDied","Data":"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959"} Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.973183 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48r6j" event={"ID":"2964d345-d05f-478d-afa8-774d76d40e94","Type":"ContainerDied","Data":"7506d039af2a648464215f03be3dec51d02ffcbc2bfedda2293a1b490a54ea58"} Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.973200 4952 scope.go:117] "RemoveContainer" containerID="05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959" Nov 22 03:39:33 crc kubenswrapper[4952]: I1122 03:39:33.973299 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48r6j" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.004346 4952 scope.go:117] "RemoveContainer" containerID="250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.018111 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.026190 4952 scope.go:117] "RemoveContainer" containerID="c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.030050 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48r6j"] Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.085141 4952 scope.go:117] "RemoveContainer" containerID="05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959" Nov 22 03:39:34 crc kubenswrapper[4952]: E1122 03:39:34.085843 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959\": container with ID starting with 05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959 not found: ID does not exist" containerID="05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.085879 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959"} err="failed to get container status \"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959\": rpc error: code = NotFound desc = could not find container \"05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959\": container with ID starting with 05dd793f442d745dcbbffddb65544369dcffb7c43c1a710bde924853ac131959 not found: ID does not exist" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.085906 4952 scope.go:117] "RemoveContainer" containerID="250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573" Nov 22 03:39:34 crc kubenswrapper[4952]: E1122 03:39:34.086473 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573\": container with ID starting with 250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573 not found: ID does not exist" containerID="250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.086569 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573"} err="failed to get container status \"250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573\": rpc error: code = NotFound desc = could not find container \"250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573\": container with ID starting with 250caac44c94dd5b5f099e53332e93bbdaa2f1de437bb11aad23401eb49d3573 not found: ID does not exist" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.086615 4952 scope.go:117] "RemoveContainer" containerID="c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2" Nov 22 03:39:34 crc kubenswrapper[4952]: E1122 03:39:34.087081 4952 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2\": container with ID starting with c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2 not found: ID does not exist" containerID="c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.087127 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2"} err="failed to get container status \"c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2\": rpc error: code = NotFound desc = could not find container \"c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2\": container with ID starting with c577014d85f14dbb0a85b859b853422f4e9b14a8060f79660b9e00e7163914f2 not found: ID does not exist" Nov 22 03:39:34 crc kubenswrapper[4952]: I1122 03:39:34.548975 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2964d345-d05f-478d-afa8-774d76d40e94" path="/var/lib/kubelet/pods/2964d345-d05f-478d-afa8-774d76d40e94/volumes" Nov 22 03:39:35 crc kubenswrapper[4952]: I1122 03:39:35.808647 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:35 crc kubenswrapper[4952]: I1122 03:39:35.808727 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:35 crc kubenswrapper[4952]: I1122 03:39:35.878676 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:36 crc kubenswrapper[4952]: I1122 03:39:36.041795 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:37 crc kubenswrapper[4952]: I1122 03:39:37.078466 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.014722 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwmp9" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="registry-server" containerID="cri-o://f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44" gracePeriod=2 Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.404930 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.449174 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities\") pod \"ef60d1be-3222-4eba-88f7-d35618db2083\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.449417 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f47rz\" (UniqueName: \"kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz\") pod \"ef60d1be-3222-4eba-88f7-d35618db2083\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.449453 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content\") pod \"ef60d1be-3222-4eba-88f7-d35618db2083\" (UID: \"ef60d1be-3222-4eba-88f7-d35618db2083\") " Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.453175 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities" (OuterVolumeSpecName: "utilities") pod "ef60d1be-3222-4eba-88f7-d35618db2083" (UID: "ef60d1be-3222-4eba-88f7-d35618db2083"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.469425 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz" (OuterVolumeSpecName: "kube-api-access-f47rz") pod "ef60d1be-3222-4eba-88f7-d35618db2083" (UID: "ef60d1be-3222-4eba-88f7-d35618db2083"). InnerVolumeSpecName "kube-api-access-f47rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.551125 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.551148 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f47rz\" (UniqueName: \"kubernetes.io/projected/ef60d1be-3222-4eba-88f7-d35618db2083-kube-api-access-f47rz\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:38 crc kubenswrapper[4952]: I1122 03:39:38.996957 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef60d1be-3222-4eba-88f7-d35618db2083" (UID: "ef60d1be-3222-4eba-88f7-d35618db2083"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.027086 4952 generic.go:334] "Generic (PLEG): container finished" podID="ef60d1be-3222-4eba-88f7-d35618db2083" containerID="f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44" exitCode=0 Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.027140 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwmp9" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.027143 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerDied","Data":"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44"} Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.027214 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwmp9" event={"ID":"ef60d1be-3222-4eba-88f7-d35618db2083","Type":"ContainerDied","Data":"67ad9b33b8bb2f9bd7be4279d02f38e4caae0343bd95fe0be7d7978d95e4e0f4"} Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.027242 4952 scope.go:117] "RemoveContainer" containerID="f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.060508 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef60d1be-3222-4eba-88f7-d35618db2083-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.069615 4952 scope.go:117] "RemoveContainer" containerID="80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.077418 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.086326 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwmp9"] Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.087276 4952 scope.go:117] "RemoveContainer" containerID="987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.129139 4952 scope.go:117] "RemoveContainer" containerID="f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44" Nov 22 03:39:39 crc kubenswrapper[4952]: E1122 03:39:39.129713 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44\": container with ID starting with f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44 not found: ID does not exist" containerID="f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.129755 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44"} err="failed to get container status \"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44\": rpc error: code = NotFound desc = could not find container \"f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44\": container with ID starting with f14e972f2f7ea373c4801243085fa011bee7bad7d1f860b3e192546241494e44 not found: ID does not exist" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.129781 4952 scope.go:117] "RemoveContainer" containerID="80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac" Nov 22 03:39:39 crc kubenswrapper[4952]: E1122 03:39:39.130051 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac\": 
container with ID starting with 80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac not found: ID does not exist" containerID="80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.130076 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac"} err="failed to get container status \"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac\": rpc error: code = NotFound desc = could not find container \"80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac\": container with ID starting with 80a1374ac4517cb373bdc285c345888d03797c5e9194de12fd93fbf98bf075ac not found: ID does not exist" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.130097 4952 scope.go:117] "RemoveContainer" containerID="987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164" Nov 22 03:39:39 crc kubenswrapper[4952]: E1122 03:39:39.130894 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164\": container with ID starting with 987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164 not found: ID does not exist" containerID="987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164" Nov 22 03:39:39 crc kubenswrapper[4952]: I1122 03:39:39.130924 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164"} err="failed to get container status \"987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164\": rpc error: code = NotFound desc = could not find container \"987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164\": container with ID starting with 987cf9ff5936d9ae291d4a1dda4fb560003019aca4c2defc06be572b237f7164 not found: ID does not exist" Nov 22 03:39:40 crc kubenswrapper[4952]: I1122 03:39:40.546353 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" path="/var/lib/kubelet/pods/ef60d1be-3222-4eba-88f7-d35618db2083/volumes" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.939826 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"] Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941092 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941113 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941138 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="extract-content" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941152 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="extract-content" Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941175 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="extract-utilities" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941186 4952 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="extract-utilities" Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941210 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="extract-utilities" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941222 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="extract-utilities" Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941267 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941277 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: E1122 03:40:37.941295 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="extract-content" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941306 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="extract-content" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941592 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef60d1be-3222-4eba-88f7-d35618db2083" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.941634 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="2964d345-d05f-478d-afa8-774d76d40e94" containerName="registry-server" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.943264 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:37 crc kubenswrapper[4952]: I1122 03:40:37.956731 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"] Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.033147 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sfph\" (UniqueName: \"kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.033203 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.033234 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.134873 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.135056 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sfph\" (UniqueName: \"kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.135105 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.135472 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.135582 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.157809 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7sfph\" (UniqueName: \"kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph\") pod \"redhat-operators-fqlm4\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") " pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.278446 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqlm4" Nov 22 03:40:38 crc kubenswrapper[4952]: I1122 03:40:38.746093 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"] Nov 22 03:40:39 crc kubenswrapper[4952]: I1122 03:40:39.605869 4952 generic.go:334] "Generic (PLEG): container finished" podID="b46c470f-1754-41de-a142-28e4f39c5a38" containerID="4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66" exitCode=0 Nov 22 03:40:39 crc kubenswrapper[4952]: I1122 03:40:39.605940 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerDied","Data":"4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66"} Nov 22 03:40:39 crc kubenswrapper[4952]: I1122 03:40:39.606112 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerStarted","Data":"238f453b1520bbffe61c4679703b7b60c7f776d530786c3e0eb82c888a1faf1b"} Nov 22 03:40:40 crc kubenswrapper[4952]: I1122 03:40:40.618655 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerStarted","Data":"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"} Nov 22 03:40:41 crc kubenswrapper[4952]: I1122 03:40:41.634069 4952 generic.go:334] "Generic (PLEG): container finished" podID="b46c470f-1754-41de-a142-28e4f39c5a38" containerID="9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e" exitCode=0 Nov 22 03:40:41 crc kubenswrapper[4952]: I1122 03:40:41.634157 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerDied","Data":"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"} Nov 22 03:40:42 crc kubenswrapper[4952]: I1122 03:40:42.652429 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerStarted","Data":"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"} Nov 22 03:40:42 crc kubenswrapper[4952]: I1122 03:40:42.684915 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqlm4" podStartSLOduration=3.199738468 podStartE2EDuration="5.684870524s" podCreationTimestamp="2025-11-22 03:40:37 +0000 UTC" firstStartedPulling="2025-11-22 03:40:39.609572016 +0000 UTC m=+2803.915589339" lastFinishedPulling="2025-11-22 03:40:42.094704112 +0000 UTC m=+2806.400721395" observedRunningTime="2025-11-22 03:40:42.679670416 +0000 UTC m=+2806.985687729" watchObservedRunningTime="2025-11-22 03:40:42.684870524 +0000 UTC m=+2806.990887807" Nov 22 03:40:48 crc kubenswrapper[4952]: I1122 03:40:48.278574 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqlm4" 
Nov 22 03:40:48 crc kubenswrapper[4952]: I1122 03:40:48.279065 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqlm4"
Nov 22 03:40:49 crc kubenswrapper[4952]: I1122 03:40:49.329730 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqlm4" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:40:49 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:40:49 crc kubenswrapper[4952]: >
Nov 22 03:40:58 crc kubenswrapper[4952]: I1122 03:40:58.328740 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqlm4"
Nov 22 03:40:58 crc kubenswrapper[4952]: I1122 03:40:58.344535 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:40:58 crc kubenswrapper[4952]: I1122 03:40:58.344627 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:40:58 crc kubenswrapper[4952]: I1122 03:40:58.382015 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqlm4"
Nov 22 03:40:58 crc kubenswrapper[4952]: I1122 03:40:58.571075 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"]
Nov 22 03:40:59 crc kubenswrapper[4952]: I1122 03:40:59.815280 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqlm4" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="registry-server" containerID="cri-o://509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c" gracePeriod=2
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.316329 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqlm4"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.411332 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sfph\" (UniqueName: \"kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph\") pod \"b46c470f-1754-41de-a142-28e4f39c5a38\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") "
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.411662 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") pod \"b46c470f-1754-41de-a142-28e4f39c5a38\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") "
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.411745 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities\") pod \"b46c470f-1754-41de-a142-28e4f39c5a38\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") "
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.412631 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities" (OuterVolumeSpecName: "utilities") pod "b46c470f-1754-41de-a142-28e4f39c5a38" (UID: "b46c470f-1754-41de-a142-28e4f39c5a38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.421034 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph" (OuterVolumeSpecName: "kube-api-access-7sfph") pod "b46c470f-1754-41de-a142-28e4f39c5a38" (UID: "b46c470f-1754-41de-a142-28e4f39c5a38"). InnerVolumeSpecName "kube-api-access-7sfph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.512825 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b46c470f-1754-41de-a142-28e4f39c5a38" (UID: "b46c470f-1754-41de-a142-28e4f39c5a38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.514475 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") pod \"b46c470f-1754-41de-a142-28e4f39c5a38\" (UID: \"b46c470f-1754-41de-a142-28e4f39c5a38\") "
Nov 22 03:41:00 crc kubenswrapper[4952]: W1122 03:41:00.514661 4952 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b46c470f-1754-41de-a142-28e4f39c5a38/volumes/kubernetes.io~empty-dir/catalog-content
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.514701 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b46c470f-1754-41de-a142-28e4f39c5a38" (UID: "b46c470f-1754-41de-a142-28e4f39c5a38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.515407 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.515500 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sfph\" (UniqueName: \"kubernetes.io/projected/b46c470f-1754-41de-a142-28e4f39c5a38-kube-api-access-7sfph\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.515583 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46c470f-1754-41de-a142-28e4f39c5a38-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.829412 4952 generic.go:334] "Generic (PLEG): container finished" podID="b46c470f-1754-41de-a142-28e4f39c5a38" containerID="509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c" exitCode=0
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.829475 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqlm4"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.829473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerDied","Data":"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"}
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.829577 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqlm4" event={"ID":"b46c470f-1754-41de-a142-28e4f39c5a38","Type":"ContainerDied","Data":"238f453b1520bbffe61c4679703b7b60c7f776d530786c3e0eb82c888a1faf1b"}
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.829596 4952 scope.go:117] "RemoveContainer" containerID="509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.870038 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"]
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.870730 4952 scope.go:117] "RemoveContainer" containerID="9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.887063 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqlm4"]
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.901565 4952 scope.go:117] "RemoveContainer" containerID="4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.932129 4952 scope.go:117] "RemoveContainer" containerID="509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"
Nov 22 03:41:00 crc kubenswrapper[4952]: E1122 03:41:00.932852 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c\": container with ID starting with 509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c not found: ID does not exist" containerID="509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.932909 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c"} err="failed to get container status \"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c\": rpc error: code = NotFound desc = could not find container \"509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c\": container with ID starting with 509971c46e735a8ef24cafc62d98a8f8a28eeb40c8c62d0dea087b9194988e3c not found: ID does not exist"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.932947 4952 scope.go:117] "RemoveContainer" containerID="9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"
Nov 22 03:41:00 crc kubenswrapper[4952]: E1122 03:41:00.933306 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e\": container with ID starting with 9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e not found: ID does not exist" containerID="9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.933330 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e"} err="failed to get container status \"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e\": rpc error: code = NotFound desc = could not find container \"9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e\": container with ID starting with 9d77c3e2f39e836d48ed712e9ecf74d8c51690d50fcf60a44f8178d45de4bb4e not found: ID does not exist"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.933345 4952 scope.go:117] "RemoveContainer" containerID="4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66"
Nov 22 03:41:00 crc kubenswrapper[4952]: E1122 03:41:00.933821 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66\": container with ID starting with 4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66 not found: ID does not exist" containerID="4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66"
Nov 22 03:41:00 crc kubenswrapper[4952]: I1122 03:41:00.933841 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66"} err="failed to get container status \"4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66\": rpc error: code = NotFound desc = could not find container \"4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66\": container with ID starting with 4a7a3a711834aef52d71a09ad2f1131a6dadf6faa7cdc17f42b1c5fc96bc5a66 not found: ID does not exist"
Nov 22 03:41:02 crc kubenswrapper[4952]: I1122 03:41:02.548700 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" path="/var/lib/kubelet/pods/b46c470f-1754-41de-a142-28e4f39c5a38/volumes"
Nov 22 03:41:13 crc kubenswrapper[4952]: I1122 03:41:13.973833 4952 generic.go:334] "Generic (PLEG): container finished" podID="4f2a731f-8431-4769-a78e-6522954dd7b5" containerID="66bffba76a6b16a0856259726bbbeb65bfae3dd5bb00910ccd2e57df206c8d5e" exitCode=0
Nov 22 03:41:13 crc kubenswrapper[4952]: I1122 03:41:13.973977 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c" event={"ID":"4f2a731f-8431-4769-a78e-6522954dd7b5","Type":"ContainerDied","Data":"66bffba76a6b16a0856259726bbbeb65bfae3dd5bb00910ccd2e57df206c8d5e"}
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.389056 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.566627 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.568162 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.568238 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vsvb\" (UniqueName: \"kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.568306 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.568347 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.568455 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory\") pod \"4f2a731f-8431-4769-a78e-6522954dd7b5\" (UID: \"4f2a731f-8431-4769-a78e-6522954dd7b5\") "
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.576249 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.576279 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph" (OuterVolumeSpecName: "ceph") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.576314 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb" (OuterVolumeSpecName: "kube-api-access-2vsvb") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "kube-api-access-2vsvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.610973 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.611086 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory" (OuterVolumeSpecName: "inventory") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.612223 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f2a731f-8431-4769-a78e-6522954dd7b5" (UID: "4f2a731f-8431-4769-a78e-6522954dd7b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671330 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671370 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vsvb\" (UniqueName: \"kubernetes.io/projected/4f2a731f-8431-4769-a78e-6522954dd7b5-kube-api-access-2vsvb\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671386 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671399 4952 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671411 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.671423 4952 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4f2a731f-8431-4769-a78e-6522954dd7b5-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.989927 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c" event={"ID":"4f2a731f-8431-4769-a78e-6522954dd7b5","Type":"ContainerDied","Data":"b776853b7699a7d16d7bacde9aed438254976007772fa43b4bbe6a5d0b417f6f"}
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.989975 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b776853b7699a7d16d7bacde9aed438254976007772fa43b4bbe6a5d0b417f6f"
Nov 22 03:41:15 crc kubenswrapper[4952]: I1122 03:41:15.990368 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.128972 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"]
Nov 22 03:41:16 crc kubenswrapper[4952]: E1122 03:41:16.130065 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="registry-server"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130112 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="registry-server"
Nov 22 03:41:16 crc kubenswrapper[4952]: E1122 03:41:16.130149 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="extract-utilities"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130166 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="extract-utilities"
Nov 22 03:41:16 crc kubenswrapper[4952]: E1122 03:41:16.130216 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="extract-content"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130236 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="extract-content"
Nov 22 03:41:16 crc kubenswrapper[4952]: E1122 03:41:16.130286 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2a731f-8431-4769-a78e-6522954dd7b5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130305 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2a731f-8431-4769-a78e-6522954dd7b5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130762 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46c470f-1754-41de-a142-28e4f39c5a38" containerName="registry-server"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.130822 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2a731f-8431-4769-a78e-6522954dd7b5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.132257 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.134716 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.135349 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.136141 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.136348 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.136677 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.136893 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fxhm9"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.137372 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.137648 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.137832 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.159068 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"]
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301595 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301661 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j66ph\" (UniqueName: \"kubernetes.io/projected/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-kube-api-access-j66ph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301685 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301712 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301732 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.301766 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.302066 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.302194 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.302343 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.302421 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.302490 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.405838 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.405930 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.405984 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406089 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406157 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406254 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406310 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406384 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406645 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406758 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j66ph\" (UniqueName: \"kubernetes.io/projected/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-kube-api-access-j66ph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.406811 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.407714 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.407723 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.411766 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.412157 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.413442 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.413476 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.413925 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.415270 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.415923 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.416989 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.429217 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j66ph\" (UniqueName: \"kubernetes.io/projected/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-kube-api-access-j66ph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:16 crc kubenswrapper[4952]: I1122 03:41:16.466073 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"
Nov 22 03:41:17 crc kubenswrapper[4952]: I1122 03:41:17.072068 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz"]
Nov 22 03:41:18 crc kubenswrapper[4952]: I1122 03:41:18.014589 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz" event={"ID":"80e23ef1-5ca7-4b59-a4a7-17586e0a1989","Type":"ContainerStarted","Data":"493de238a2bae7b1a3bb35ec06e796f60617e50cab5f9442f8bd09fa2375ba0c"}
Nov 22 03:41:18 crc kubenswrapper[4952]: I1122 03:41:18.015058 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz" event={"ID":"80e23ef1-5ca7-4b59-a4a7-17586e0a1989","Type":"ContainerStarted","Data":"ba52c10cb4ab5ce8bc561b3f97e04afc83c7ff3e3a5bbe7017c2e62dacf475f0"}
Nov 22 03:41:18 crc kubenswrapper[4952]: I1122 03:41:18.043849 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz" podStartSLOduration=1.596549391 podStartE2EDuration="2.043820696s" podCreationTimestamp="2025-11-22 03:41:16 +0000 UTC" firstStartedPulling="2025-11-22 03:41:17.068620479 +0000 UTC m=+2841.374637752" lastFinishedPulling="2025-11-22 03:41:17.515891764 +0000 UTC m=+2841.821909057" observedRunningTime="2025-11-22 03:41:18.037732754 +0000 UTC m=+2842.343750057" watchObservedRunningTime="2025-11-22 03:41:18.043820696 +0000 UTC m=+2842.349837989"
Nov 22 03:41:28 crc kubenswrapper[4952]: I1122 03:41:28.342657 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:41:28 crc kubenswrapper[4952]: I1122 03:41:28.343332 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:41:58 crc kubenswrapper[4952]: I1122 03:41:58.342418 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:41:58 crc kubenswrapper[4952]: I1122 03:41:58.343029 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:41:58 crc kubenswrapper[4952]: I1122 03:41:58.343083 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:41:58 crc kubenswrapper[4952]: I1122 03:41:58.343975 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:41:58 crc kubenswrapper[4952]: I1122 03:41:58.344056 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" gracePeriod=600
Nov 22 03:41:58 crc kubenswrapper[4952]: E1122 03:41:58.481955 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:41:59 crc kubenswrapper[4952]: I1122 03:41:59.471295 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" exitCode=0
Nov 22 03:41:59 crc kubenswrapper[4952]: I1122 03:41:59.471392 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"}
Nov 22 03:41:59 crc kubenswrapper[4952]: I1122 03:41:59.471647 4952 scope.go:117] "RemoveContainer" containerID="707d2fd1ef1cff3aa64137fd32f9e85383520dcc99249ecbe7832a34839ab912"
Nov 22 03:41:59 crc kubenswrapper[4952]: I1122 03:41:59.472264 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"
Nov 22 03:41:59 crc kubenswrapper[4952]: E1122 03:41:59.472561 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:42:14 crc kubenswrapper[4952]: I1122 03:42:14.533392 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"
Nov 22 03:42:14 crc kubenswrapper[4952]: E1122 03:42:14.534337 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:42:26 crc kubenswrapper[4952]: I1122 03:42:26.543434 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"
Nov 22 03:42:26 crc kubenswrapper[4952]: E1122 03:42:26.544490 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:42:39 crc kubenswrapper[4952]: I1122 03:42:39.531631 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:42:39 crc kubenswrapper[4952]: E1122 03:42:39.532457 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:42:51 crc kubenswrapper[4952]: I1122 03:42:51.532337 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:42:51 crc kubenswrapper[4952]: E1122 03:42:51.534059 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:43:02 crc kubenswrapper[4952]: I1122 03:43:02.532148 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:43:02 crc kubenswrapper[4952]: E1122 03:43:02.532956 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:43:13 crc kubenswrapper[4952]: I1122 03:43:13.531448 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:43:13 crc kubenswrapper[4952]: E1122 03:43:13.532687 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:43:28 crc kubenswrapper[4952]: I1122 03:43:28.531354 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:43:28 crc kubenswrapper[4952]: E1122 03:43:28.532163 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:43:41 crc kubenswrapper[4952]: I1122 03:43:41.531678 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:43:41 crc kubenswrapper[4952]: E1122 03:43:41.532493 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:43:52 crc kubenswrapper[4952]: I1122 03:43:52.532507 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:43:52 crc kubenswrapper[4952]: E1122 03:43:52.533621 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:44:06 crc kubenswrapper[4952]: I1122 03:44:06.540922 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:44:06 crc kubenswrapper[4952]: E1122 03:44:06.542405 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.111949 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.116110 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.122750 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.122840 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.122907 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjs8q\" (UniqueName: \"kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.156436 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.225137 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.225224 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.225309 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjs8q\" (UniqueName: \"kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.226229 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.227037 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.252005 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cjs8q\" (UniqueName: \"kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q\") pod \"community-operators-qc8vt\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.446615 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.773920 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:14 crc kubenswrapper[4952]: I1122 03:44:14.867731 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerStarted","Data":"829b9a70347ce4d75db3d00fa2fe23f23fe2b0605b25e8016a5e2f4df4426486"} Nov 22 03:44:15 crc kubenswrapper[4952]: I1122 03:44:15.876826 4952 generic.go:334] "Generic (PLEG): container finished" podID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerID="c58e7f44b616beac7769a520d0b87ada38ab1bee9c2be1faa5493194fcc8545d" exitCode=0 Nov 22 03:44:15 crc kubenswrapper[4952]: I1122 03:44:15.876886 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerDied","Data":"c58e7f44b616beac7769a520d0b87ada38ab1bee9c2be1faa5493194fcc8545d"} Nov 22 03:44:16 crc kubenswrapper[4952]: I1122 03:44:16.890127 4952 generic.go:334] "Generic (PLEG): container finished" podID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerID="2a97d988198d26db45920fbb7e85c2d922603ee037ee761c94b76915d3d065db" exitCode=0 Nov 22 03:44:16 crc kubenswrapper[4952]: I1122 03:44:16.890484 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerDied","Data":"2a97d988198d26db45920fbb7e85c2d922603ee037ee761c94b76915d3d065db"} Nov 22 03:44:17 crc kubenswrapper[4952]: I1122 03:44:17.903571 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerStarted","Data":"3fd8ab46c5f1b3fc8d929b41d8370a61d1f7230bacb6773799f8a57573e328ff"} Nov 22 03:44:17 crc kubenswrapper[4952]: I1122 03:44:17.922159 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qc8vt" podStartSLOduration=2.462627852 podStartE2EDuration="3.922135696s" podCreationTimestamp="2025-11-22 03:44:14 +0000 UTC" firstStartedPulling="2025-11-22 03:44:15.878865691 +0000 UTC m=+3020.184882964" lastFinishedPulling="2025-11-22 03:44:17.338373535 +0000 UTC m=+3021.644390808" observedRunningTime="2025-11-22 03:44:17.91923945 +0000 UTC m=+3022.225256733" watchObservedRunningTime="2025-11-22 03:44:17.922135696 +0000 UTC m=+3022.228152989" Nov 22 03:44:19 crc kubenswrapper[4952]: I1122 03:44:19.530645 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:44:19 crc kubenswrapper[4952]: E1122 03:44:19.531763 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Nov 22 03:44:19 crc kubenswrapper[4952]: I1122 03:44:19.530645 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"
Nov 22 03:44:19 crc kubenswrapper[4952]: E1122 03:44:19.531763 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
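The E1122 record above is the pod worker refusing to restart machine-config-daemon while its crash-loop back-off window is open; "back-off 5m0s" is the cap the delay has reached. Kubelet's container restart back-off is commonly described as doubling from an initial delay up to a five-minute ceiling; the exact constants below are assumptions used only to illustrate the shape of the curve:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Hypothetical doubling restart back-off with a 5m cap, the cap being
	// the "back-off 5m0s" figure in the error record above.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for i := 1; i <= 8; i++ {
		fmt.Printf("restart %d: wait %v\n", i, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

With these assumed constants the delay sequence is 10s, 20s, 40s, 1m20s, 2m40s, then pinned at 5m0s, which is why the same "back-off 5m0s" message recurs at 03:44:33 below.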
\"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.513867 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0\") pod \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.513945 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory\") pod \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.513991 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key\") pod \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.514026 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1\") pod \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\" (UID: \"80e23ef1-5ca7-4b59-a4a7-17586e0a1989\") " Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.520762 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph" (OuterVolumeSpecName: "ceph") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.521346 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.521462 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-kube-api-access-j66ph" (OuterVolumeSpecName: "kube-api-access-j66ph") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "kube-api-access-j66ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.542622 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.545426 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.546525 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.546539 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.552809 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.554910 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory" (OuterVolumeSpecName: "inventory") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.555510 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.556188 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80e23ef1-5ca7-4b59-a4a7-17586e0a1989" (UID: "80e23ef1-5ca7-4b59-a4a7-17586e0a1989"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.616129 4952 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.616375 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.616496 4952 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.616526 4952 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.616567 4952 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617491 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j66ph\" (UniqueName: \"kubernetes.io/projected/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-kube-api-access-j66ph\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617589 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617679 4952 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617779 4952 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617849 4952 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.617935 4952 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/80e23ef1-5ca7-4b59-a4a7-17586e0a1989-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.964340 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz" event={"ID":"80e23ef1-5ca7-4b59-a4a7-17586e0a1989","Type":"ContainerDied","Data":"ba52c10cb4ab5ce8bc561b3f97e04afc83c7ff3e3a5bbe7017c2e62dacf475f0"} Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.964422 4952 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="ba52c10cb4ab5ce8bc561b3f97e04afc83c7ff3e3a5bbe7017c2e62dacf475f0" Nov 22 03:44:23 crc kubenswrapper[4952]: I1122 03:44:23.964516 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz" Nov 22 03:44:24 crc kubenswrapper[4952]: I1122 03:44:24.447118 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:24 crc kubenswrapper[4952]: I1122 03:44:24.447402 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:24 crc kubenswrapper[4952]: I1122 03:44:24.547372 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:25 crc kubenswrapper[4952]: I1122 03:44:25.036290 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:25 crc kubenswrapper[4952]: I1122 03:44:25.094664 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:26 crc kubenswrapper[4952]: I1122 03:44:26.999214 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qc8vt" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="registry-server" containerID="cri-o://3fd8ab46c5f1b3fc8d929b41d8370a61d1f7230bacb6773799f8a57573e328ff" gracePeriod=2 Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.012727 4952 generic.go:334] "Generic (PLEG): container finished" podID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerID="3fd8ab46c5f1b3fc8d929b41d8370a61d1f7230bacb6773799f8a57573e328ff" exitCode=0 Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.012791 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerDied","Data":"3fd8ab46c5f1b3fc8d929b41d8370a61d1f7230bacb6773799f8a57573e328ff"} Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.013246 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qc8vt" event={"ID":"4debd57c-fc7c-43b3-bab6-85aa3d6185a5","Type":"ContainerDied","Data":"829b9a70347ce4d75db3d00fa2fe23f23fe2b0605b25e8016a5e2f4df4426486"} Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.013277 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829b9a70347ce4d75db3d00fa2fe23f23fe2b0605b25e8016a5e2f4df4426486" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.038370 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.120959 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjs8q\" (UniqueName: \"kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q\") pod \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.121055 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities\") pod \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.121097 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content\") pod \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\" (UID: \"4debd57c-fc7c-43b3-bab6-85aa3d6185a5\") " Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.122348 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities" (OuterVolumeSpecName: "utilities") pod "4debd57c-fc7c-43b3-bab6-85aa3d6185a5" (UID: "4debd57c-fc7c-43b3-bab6-85aa3d6185a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.128404 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q" (OuterVolumeSpecName: "kube-api-access-cjs8q") pod "4debd57c-fc7c-43b3-bab6-85aa3d6185a5" (UID: "4debd57c-fc7c-43b3-bab6-85aa3d6185a5"). InnerVolumeSpecName "kube-api-access-cjs8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.182728 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4debd57c-fc7c-43b3-bab6-85aa3d6185a5" (UID: "4debd57c-fc7c-43b3-bab6-85aa3d6185a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.222771 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjs8q\" (UniqueName: \"kubernetes.io/projected/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-kube-api-access-cjs8q\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.222811 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:28 crc kubenswrapper[4952]: I1122 03:44:28.222821 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4debd57c-fc7c-43b3-bab6-85aa3d6185a5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:29 crc kubenswrapper[4952]: I1122 03:44:29.029397 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qc8vt" Nov 22 03:44:29 crc kubenswrapper[4952]: I1122 03:44:29.073282 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:29 crc kubenswrapper[4952]: I1122 03:44:29.079241 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qc8vt"] Nov 22 03:44:30 crc kubenswrapper[4952]: I1122 03:44:30.550516 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" path="/var/lib/kubelet/pods/4debd57c-fc7c-43b3-bab6-85aa3d6185a5/volumes" Nov 22 03:44:33 crc kubenswrapper[4952]: I1122 03:44:33.531826 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:44:33 crc kubenswrapper[4952]: E1122 03:44:33.532812 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.919700 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 22 03:44:37 crc kubenswrapper[4952]: E1122 03:44:37.936776 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="registry-server" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.936820 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="registry-server" Nov 22 03:44:37 crc kubenswrapper[4952]: E1122 03:44:37.936845 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="extract-utilities" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.936852 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="extract-utilities" Nov 22 03:44:37 crc kubenswrapper[4952]: E1122 03:44:37.936874 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="extract-content" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.936882 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="extract-content" Nov 22 03:44:37 crc kubenswrapper[4952]: E1122 03:44:37.936917 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e23ef1-5ca7-4b59-a4a7-17586e0a1989" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.936925 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e23ef1-5ca7-4b59-a4a7-17586e0a1989" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.937266 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="4debd57c-fc7c-43b3-bab6-85aa3d6185a5" containerName="registry-server" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.937279 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e23ef1-5ca7-4b59-a4a7-17586e0a1989" 
containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.942343 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.949455 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.949468 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.949773 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.953428 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.968075 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.972975 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 22 03:44:37 crc kubenswrapper[4952]: I1122 03:44:37.977458 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.056410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-run\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.056723 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.056938 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4vr\" (UniqueName: \"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-kube-api-access-hb4vr\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.056989 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057026 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057047 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057066 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057090 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057121 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057148 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057174 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057209 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057242 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057271 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvp8m\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-kube-api-access-nvp8m\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057302 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057323 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057346 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-ceph\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057397 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-sys\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057419 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057441 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057459 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057494 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057513 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057577 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-run\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " 
pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057602 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057637 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057663 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057699 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057724 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057749 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-dev\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057774 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.057798 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-scripts\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159389 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159693 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159889 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159533 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160015 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.159889 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160078 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160110 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160134 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvp8m\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-kube-api-access-nvp8m\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160159 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: 
\"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160174 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160194 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-ceph\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160211 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-sys\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160228 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160244 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160260 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160292 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160312 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160337 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-run\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160353 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160380 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160398 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160424 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160440 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160456 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-dev\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160475 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160494 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-scripts\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160512 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-run\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160532 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160585 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4vr\" (UniqueName: 
\"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-kube-api-access-hb4vr\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160609 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160635 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160649 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160663 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160726 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.160974 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.161010 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.161167 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.161735 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-run\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.161818 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.161959 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162045 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-dev\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162012 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-sys\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162092 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162313 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162443 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162579 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162618 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-run\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162647 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162682 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/6dd872c8-ca07-4e06-9666-22d89916ead1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.162713 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.167509 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.167629 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.168081 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-ceph\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.168636 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-config-data\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.170395 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-scripts\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.174939 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.175219 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.178808 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.182681 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hb4vr\" (UniqueName: \"kubernetes.io/projected/e99cda79-b32c-4e09-8c24-9a4eb0c934ef-kube-api-access-hb4vr\") pod \"cinder-backup-0\" (UID: \"e99cda79-b32c-4e09-8c24-9a4eb0c934ef\") " pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.182978 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd872c8-ca07-4e06-9666-22d89916ead1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.189283 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvp8m\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-kube-api-access-nvp8m\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.199031 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6dd872c8-ca07-4e06-9666-22d89916ead1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6dd872c8-ca07-4e06-9666-22d89916ead1\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.280617 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.294323 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.490474 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-x9v6c"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.494438 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.512945 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-x9v6c"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.570196 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.570302 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2rp\" (UniqueName: \"kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.608597 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-f5d0-account-create-2p542"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.609949 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.612624 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.625027 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-f5d0-account-create-2p542"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.665926 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.667436 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.669190 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.669317 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6jgdh" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.669584 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.670131 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.672059 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.672115 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8t9\" (UniqueName: \"kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.672214 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.672244 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2rp\" (UniqueName: \"kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.673445 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.694949 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 
22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.717466 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2rp\" (UniqueName: \"kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp\") pod \"manila-db-create-x9v6c\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.760927 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.762442 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776383 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvcmf\" (UniqueName: \"kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776429 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm8t9\" (UniqueName: \"kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776460 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776478 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776503 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776591 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.776649 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 
03:44:38.777327 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.782277 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.784103 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.802591 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.803219 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mtlf4" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.803450 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.804019 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.804362 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.818133 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm8t9\" (UniqueName: \"kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9\") pod \"manila-f5d0-account-create-2p542\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.820025 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.821048 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878597 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878645 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878667 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hnsg\" (UniqueName: \"kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878705 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878737 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878756 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878792 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvcmf\" (UniqueName: \"kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878812 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878836 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts\") pod \"horizon-dccb8556f-cmgsg\" (UID: 
\"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878854 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878873 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878895 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878926 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878947 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878963 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmk8d\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.878982 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.879013 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.879032 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs\") pod \"glance-default-external-api-0\" (UID: 
\"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.879057 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.879432 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.880195 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.885692 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.887736 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.894050 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.895623 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.899308 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.899572 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.907307 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.907849 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvcmf\" (UniqueName: \"kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf\") pod \"horizon-dccb8556f-cmgsg\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.950428 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.976159 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.983871 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.983911 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.983947 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.984161 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985216 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985307 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985612 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985685 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985795 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985842 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985873 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmk8d\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985894 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985913 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985926 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985947 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.985995 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986023 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986089 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " 
pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986107 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986137 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986157 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986197 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrfc\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986234 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986251 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hnsg\" (UniqueName: \"kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.986294 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.987354 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.987404 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " 
pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.990499 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.992323 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.992787 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.995069 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.995772 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.998926 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.999017 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:38 crc kubenswrapper[4952]: I1122 03:44:38.999880 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.008527 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.013808 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hnsg\" (UniqueName: \"kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg\") pod \"horizon-69d47b8cc7-k9qjl\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " 
pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.014255 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmk8d\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.018308 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.039594 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.087596 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.087747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.087837 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.087915 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrfc\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.087998 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.088074 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.088147 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.088315 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.088399 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.103097 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.103587 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.105263 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.108302 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.108638 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.118713 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.121407 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.128755 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.131073 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.136745 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.141008 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.155255 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrfc\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.160773 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6dd872c8-ca07-4e06-9666-22d89916ead1","Type":"ContainerStarted","Data":"0a2eef79d660f9ba6065d04962c308f73e5f98f90c4b3ec442714a533d909034"} Nov 22 03:44:39 crc kubenswrapper[4952]: W1122 03:44:39.171696 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode99cda79_b32c_4e09_8c24_9a4eb0c934ef.slice/crio-8b13da7e714ebcac538f12eed41d3c1df64ca8fb11d5fdb80177e42a14c16015 WatchSource:0}: Error finding container 8b13da7e714ebcac538f12eed41d3c1df64ca8fb11d5fdb80177e42a14c16015: Status 404 returned error can't find the container with id 8b13da7e714ebcac538f12eed41d3c1df64ca8fb11d5fdb80177e42a14c16015 Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.174105 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.222078 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.420099 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-f5d0-account-create-2p542"] Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.478972 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-x9v6c"] Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.624712 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 22 03:44:39 crc kubenswrapper[4952]: I1122 03:44:39.992685 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.008008 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.094385 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:40 crc kubenswrapper[4952]: W1122 03:44:40.150678 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17666473_50ea_48ef_afc8_265daec6df33.slice/crio-9af805f778982576c093a678bd31163cd06118fa6114d6d8b610806b0e607241 WatchSource:0}: Error finding container 9af805f778982576c093a678bd31163cd06118fa6114d6d8b610806b0e607241: Status 404 returned error can't find the container with id 9af805f778982576c093a678bd31163cd06118fa6114d6d8b610806b0e607241 Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.173581 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerStarted","Data":"cc5298b811d0c584151d4c8c1387f1fc00ccae31b892cc82dda43d0a0fe943ea"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.178853 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerStarted","Data":"10f01e8070ab5448bfafcdf4e552c8cb82e83e9e8b37dad06243c72cd11c9284"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.180212 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerStarted","Data":"eb79dab2161b79e88c6fa19d8eb06e32c96ef82ef02da5f31123e7392db7a78d"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.181621 4952 generic.go:334] "Generic (PLEG): container finished" podID="b133bfea-bd18-486b-8ab0-12bba9c84fe6" containerID="89feb0c52b8664344ac195a1d423b135377ff8bb6d3372a9d3ded78920987224" exitCode=0 Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.181684 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f5d0-account-create-2p542" event={"ID":"b133bfea-bd18-486b-8ab0-12bba9c84fe6","Type":"ContainerDied","Data":"89feb0c52b8664344ac195a1d423b135377ff8bb6d3372a9d3ded78920987224"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.181720 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f5d0-account-create-2p542" event={"ID":"b133bfea-bd18-486b-8ab0-12bba9c84fe6","Type":"ContainerStarted","Data":"588b7e94ea075c2b71c9605a7eef98e6e9f87973e3e624ce470d45edb587a32b"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.182730 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-69d47b8cc7-k9qjl" event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerStarted","Data":"9af805f778982576c093a678bd31163cd06118fa6114d6d8b610806b0e607241"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.183873 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e99cda79-b32c-4e09-8c24-9a4eb0c934ef","Type":"ContainerStarted","Data":"8b13da7e714ebcac538f12eed41d3c1df64ca8fb11d5fdb80177e42a14c16015"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.185472 4952 generic.go:334] "Generic (PLEG): container finished" podID="aab2b754-fb45-442c-8efd-feca454390f3" containerID="91121fe4e73b57946e9dd61ef4c9639fee20c38f9d4fef72738fd1645a7fed06" exitCode=0 Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.185531 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-x9v6c" event={"ID":"aab2b754-fb45-442c-8efd-feca454390f3","Type":"ContainerDied","Data":"91121fe4e73b57946e9dd61ef4c9639fee20c38f9d4fef72738fd1645a7fed06"} Nov 22 03:44:40 crc kubenswrapper[4952]: I1122 03:44:40.185580 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-x9v6c" event={"ID":"aab2b754-fb45-442c-8efd-feca454390f3","Type":"ContainerStarted","Data":"7113f19c52dc6597531764cd1af299b10aa2f6cfc6ea646e3ac3deb42a4155c1"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.240903 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.267956 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6dd872c8-ca07-4e06-9666-22d89916ead1","Type":"ContainerStarted","Data":"ff9be9f0c71ebc8a69c1d81354c5384f36d221805cb46522cd3d854b7715d2ac"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.268009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6dd872c8-ca07-4e06-9666-22d89916ead1","Type":"ContainerStarted","Data":"bb72c809cfd3378468bb08ca8c7f08eda5c73d96bb78a816c95279c280771a4a"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.329533 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e99cda79-b32c-4e09-8c24-9a4eb0c934ef","Type":"ContainerStarted","Data":"cbeb47750a7e1421dc34e09feb4f01550dc06dd4fbaa306b7780626632a7feca"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.329598 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e99cda79-b32c-4e09-8c24-9a4eb0c934ef","Type":"ContainerStarted","Data":"92fb67582c949ecf811f413075cba86b6003f01362ac63ab2227cb0946bfe3d2"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.334772 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.337157 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.357466 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.361125 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerStarted","Data":"fd57823adc23072fab45f6b18e1808582e349460f56b3213ac7cf7bab23ec150"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.365689 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371207 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371263 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371337 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371395 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371846 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.371948 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.372028 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgq8l\" (UniqueName: \"kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.384719 
4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerStarted","Data":"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66"} Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.396788 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.425850 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.432014 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.21692953 podStartE2EDuration="4.431984851s" podCreationTimestamp="2025-11-22 03:44:37 +0000 UTC" firstStartedPulling="2025-11-22 03:44:39.007784383 +0000 UTC m=+3043.313801656" lastFinishedPulling="2025-11-22 03:44:40.222839704 +0000 UTC m=+3044.528856977" observedRunningTime="2025-11-22 03:44:41.299409594 +0000 UTC m=+3045.605426867" watchObservedRunningTime="2025-11-22 03:44:41.431984851 +0000 UTC m=+3045.738002124" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.454241 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56848f9f44-m7z42"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.456533 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.463295 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56848f9f44-m7z42"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.469157 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.451973572 podStartE2EDuration="4.469137176s" podCreationTimestamp="2025-11-22 03:44:37 +0000 UTC" firstStartedPulling="2025-11-22 03:44:39.177206126 +0000 UTC m=+3043.483223399" lastFinishedPulling="2025-11-22 03:44:40.19436972 +0000 UTC m=+3044.500387003" observedRunningTime="2025-11-22 03:44:41.382341124 +0000 UTC m=+3045.688358397" watchObservedRunningTime="2025-11-22 03:44:41.469137176 +0000 UTC m=+3045.775154449" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.480963 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-config-data\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481037 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-scripts\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481085 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481116 4952 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-secret-key\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481221 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481316 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-tls-certs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481337 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgq8l\" (UniqueName: \"kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481397 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5l6\" (UniqueName: \"kubernetes.io/projected/ea3db97f-72b6-4eaa-b8ea-256a5691008f-kube-api-access-nn5l6\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481451 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3db97f-72b6-4eaa-b8ea-256a5691008f-logs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481564 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481584 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-combined-ca-bundle\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481615 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481679 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.481727 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.482636 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.485329 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.486420 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.491076 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.515774 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.517672 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.529972 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgq8l\" (UniqueName: \"kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l\") pod \"horizon-5675778f5b-wg7px\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.549320 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584280 4952 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-combined-ca-bundle\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584415 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-config-data\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584442 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-scripts\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584467 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-secret-key\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584560 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-tls-certs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584588 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5l6\" (UniqueName: \"kubernetes.io/projected/ea3db97f-72b6-4eaa-b8ea-256a5691008f-kube-api-access-nn5l6\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.584629 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3db97f-72b6-4eaa-b8ea-256a5691008f-logs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.585105 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3db97f-72b6-4eaa-b8ea-256a5691008f-logs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.588294 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-combined-ca-bundle\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.589417 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-config-data\") pod \"horizon-56848f9f44-m7z42\" (UID: 
\"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.589823 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3db97f-72b6-4eaa-b8ea-256a5691008f-scripts\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.593396 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-tls-certs\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.595793 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea3db97f-72b6-4eaa-b8ea-256a5691008f-horizon-secret-key\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.618984 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5l6\" (UniqueName: \"kubernetes.io/projected/ea3db97f-72b6-4eaa-b8ea-256a5691008f-kube-api-access-nn5l6\") pod \"horizon-56848f9f44-m7z42\" (UID: \"ea3db97f-72b6-4eaa-b8ea-256a5691008f\") " pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.697203 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.710496 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.848647 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.855883 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.890235 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm8t9\" (UniqueName: \"kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9\") pod \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.890378 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2rp\" (UniqueName: \"kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp\") pod \"aab2b754-fb45-442c-8efd-feca454390f3\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.890416 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts\") pod \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\" (UID: \"b133bfea-bd18-486b-8ab0-12bba9c84fe6\") " Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.890522 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts\") pod \"aab2b754-fb45-442c-8efd-feca454390f3\" (UID: \"aab2b754-fb45-442c-8efd-feca454390f3\") " Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.892158 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aab2b754-fb45-442c-8efd-feca454390f3" (UID: "aab2b754-fb45-442c-8efd-feca454390f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.892473 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b133bfea-bd18-486b-8ab0-12bba9c84fe6" (UID: "b133bfea-bd18-486b-8ab0-12bba9c84fe6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.896134 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp" (OuterVolumeSpecName: "kube-api-access-sn2rp") pod "aab2b754-fb45-442c-8efd-feca454390f3" (UID: "aab2b754-fb45-442c-8efd-feca454390f3"). InnerVolumeSpecName "kube-api-access-sn2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.897881 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9" (OuterVolumeSpecName: "kube-api-access-nm8t9") pod "b133bfea-bd18-486b-8ab0-12bba9c84fe6" (UID: "b133bfea-bd18-486b-8ab0-12bba9c84fe6"). InnerVolumeSpecName "kube-api-access-nm8t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.993773 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2rp\" (UniqueName: \"kubernetes.io/projected/aab2b754-fb45-442c-8efd-feca454390f3-kube-api-access-sn2rp\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.994425 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b133bfea-bd18-486b-8ab0-12bba9c84fe6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.994437 4952 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab2b754-fb45-442c-8efd-feca454390f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:41 crc kubenswrapper[4952]: I1122 03:44:41.994446 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm8t9\" (UniqueName: \"kubernetes.io/projected/b133bfea-bd18-486b-8ab0-12bba9c84fe6-kube-api-access-nm8t9\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.414288 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-x9v6c" event={"ID":"aab2b754-fb45-442c-8efd-feca454390f3","Type":"ContainerDied","Data":"7113f19c52dc6597531764cd1af299b10aa2f6cfc6ea646e3ac3deb42a4155c1"} Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.414812 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7113f19c52dc6597531764cd1af299b10aa2f6cfc6ea646e3ac3deb42a4155c1" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.414903 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-x9v6c" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.420682 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56848f9f44-m7z42"] Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.429988 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerStarted","Data":"7a15769e6d4aec4af039fcac903214acfffd208ad2b150d16b025e863d65c5c6"} Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.430179 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-log" containerID="cri-o://fd57823adc23072fab45f6b18e1808582e349460f56b3213ac7cf7bab23ec150" gracePeriod=30 Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.430744 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-httpd" containerID="cri-o://7a15769e6d4aec4af039fcac903214acfffd208ad2b150d16b025e863d65c5c6" gracePeriod=30 Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.458741 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerStarted","Data":"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a"} Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.459074 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-log" containerID="cri-o://28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" gracePeriod=30 Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.459776 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-httpd" containerID="cri-o://6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" gracePeriod=30 Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.483469 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.483450005 podStartE2EDuration="4.483450005s" podCreationTimestamp="2025-11-22 03:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:44:42.46669137 +0000 UTC m=+3046.772708643" watchObservedRunningTime="2025-11-22 03:44:42.483450005 +0000 UTC m=+3046.789467278" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.518055 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f5d0-account-create-2p542" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.518151 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f5d0-account-create-2p542" event={"ID":"b133bfea-bd18-486b-8ab0-12bba9c84fe6","Type":"ContainerDied","Data":"588b7e94ea075c2b71c9605a7eef98e6e9f87973e3e624ce470d45edb587a32b"} Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.518220 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588b7e94ea075c2b71c9605a7eef98e6e9f87973e3e624ce470d45edb587a32b" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.560565 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.5605106079999995 podStartE2EDuration="4.560510608s" podCreationTimestamp="2025-11-22 03:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:44:42.526468976 +0000 UTC m=+3046.832486249" watchObservedRunningTime="2025-11-22 03:44:42.560510608 +0000 UTC m=+3046.866527871" Nov 22 03:44:42 crc kubenswrapper[4952]: I1122 03:44:42.569346 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.047074 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.120865 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.155393 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222314 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222394 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222450 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222480 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222699 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmk8d\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222808 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222846 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.222903 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\" (UID: \"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce\") " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.223267 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs" (OuterVolumeSpecName: "logs") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.223838 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.223856 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.224253 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.227120 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph" (OuterVolumeSpecName: "ceph") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.227351 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts" (OuterVolumeSpecName: "scripts") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.230459 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.232299 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d" (OuterVolumeSpecName: "kube-api-access-bmk8d") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "kube-api-access-bmk8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.281182 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.295434 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.301681 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.305581 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data" (OuterVolumeSpecName: "config-data") pod "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" (UID: "5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325404 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmk8d\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-kube-api-access-bmk8d\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325443 4952 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325457 4952 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325503 4952 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325516 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325529 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.325539 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.354876 4952 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.427392 4952 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.534971 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56848f9f44-m7z42" event={"ID":"ea3db97f-72b6-4eaa-b8ea-256a5691008f","Type":"ContainerStarted","Data":"13c26a1845fe2373bb25dc055bd57544617ebf58165966d932bcff7a1db65cb5"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.537246 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerStarted","Data":"7e6136528ed73b43e4e674675c3fa3ed5abea6dbcad3ec5608d87440097114d6"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.542807 4952 generic.go:334] "Generic (PLEG): container finished" 
podID="b8b17cec-6979-496c-8e36-40060c31ed82" containerID="7a15769e6d4aec4af039fcac903214acfffd208ad2b150d16b025e863d65c5c6" exitCode=143 Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.542849 4952 generic.go:334] "Generic (PLEG): container finished" podID="b8b17cec-6979-496c-8e36-40060c31ed82" containerID="fd57823adc23072fab45f6b18e1808582e349460f56b3213ac7cf7bab23ec150" exitCode=143 Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.542901 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerDied","Data":"7a15769e6d4aec4af039fcac903214acfffd208ad2b150d16b025e863d65c5c6"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.542926 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerDied","Data":"fd57823adc23072fab45f6b18e1808582e349460f56b3213ac7cf7bab23ec150"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545290 4952 generic.go:334] "Generic (PLEG): container finished" podID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerID="6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" exitCode=143 Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545318 4952 generic.go:334] "Generic (PLEG): container finished" podID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerID="28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" exitCode=143 Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545376 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545396 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerDied","Data":"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545434 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerDied","Data":"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545467 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce","Type":"ContainerDied","Data":"eb79dab2161b79e88c6fa19d8eb06e32c96ef82ef02da5f31123e7392db7a78d"} Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.545488 4952 scope.go:117] "RemoveContainer" containerID="6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.589559 4952 scope.go:117] "RemoveContainer" containerID="28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.598477 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.621690 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.624536 4952 scope.go:117] "RemoveContainer" containerID="6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" Nov 22 03:44:43 crc 
kubenswrapper[4952]: E1122 03:44:43.624999 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a\": container with ID starting with 6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a not found: ID does not exist" containerID="6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625040 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a"} err="failed to get container status \"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a\": rpc error: code = NotFound desc = could not find container \"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a\": container with ID starting with 6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a not found: ID does not exist" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625066 4952 scope.go:117] "RemoveContainer" containerID="28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" Nov 22 03:44:43 crc kubenswrapper[4952]: E1122 03:44:43.625429 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66\": container with ID starting with 28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66 not found: ID does not exist" containerID="28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625453 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66"} err="failed to get container status \"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66\": rpc error: code = NotFound desc = could not find container \"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66\": container with ID starting with 28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66 not found: ID does not exist" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625470 4952 scope.go:117] "RemoveContainer" containerID="6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625788 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a"} err="failed to get container status \"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a\": rpc error: code = NotFound desc = could not find container \"6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a\": container with ID starting with 6e964be88e927800a48d39d3cef1a41617d5879549ee2aa8cdfecf05e96c578a not found: ID does not exist" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.625810 4952 scope.go:117] "RemoveContainer" containerID="28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.626027 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66"} err="failed to get container status 
\"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66\": rpc error: code = NotFound desc = could not find container \"28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66\": container with ID starting with 28de63ae3c9677796784c457a210ca34cb92d3170ae937bb314f2706ff8e7b66 not found: ID does not exist" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.631731 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:43 crc kubenswrapper[4952]: E1122 03:44:43.632138 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b133bfea-bd18-486b-8ab0-12bba9c84fe6" containerName="mariadb-account-create" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632154 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b133bfea-bd18-486b-8ab0-12bba9c84fe6" containerName="mariadb-account-create" Nov 22 03:44:43 crc kubenswrapper[4952]: E1122 03:44:43.632166 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-httpd" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632172 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-httpd" Nov 22 03:44:43 crc kubenswrapper[4952]: E1122 03:44:43.632202 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab2b754-fb45-442c-8efd-feca454390f3" containerName="mariadb-database-create" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632208 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab2b754-fb45-442c-8efd-feca454390f3" containerName="mariadb-database-create" Nov 22 03:44:43 crc kubenswrapper[4952]: E1122 03:44:43.632221 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-log" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632227 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-log" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632424 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-log" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632446 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" containerName="glance-httpd" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632460 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b133bfea-bd18-486b-8ab0-12bba9c84fe6" containerName="mariadb-account-create" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.632472 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab2b754-fb45-442c-8efd-feca454390f3" containerName="mariadb-database-create" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.633646 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.639730 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.639812 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.640524 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732155 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-config-data\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732592 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-ceph\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732670 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732743 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732786 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732808 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-logs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732827 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.732879 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l6rnx\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-kube-api-access-l6rnx\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.733056 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-scripts\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.834145 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-scripts\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.834203 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-config-data\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.834907 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-ceph\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.834964 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.838811 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.843407 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-ceph\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.846964 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-config-data\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.857412 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.860125 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-scripts\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.835010 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.869192 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.869266 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-logs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.869302 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.869382 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6rnx\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-kube-api-access-l6rnx\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.873973 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-logs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.877486 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72dcd5f8-1d52-42ab-a481-313bf4f5148d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.880906 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 
crc kubenswrapper[4952]: I1122 03:44:43.889423 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dcd5f8-1d52-42ab-a481-313bf4f5148d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.898640 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6rnx\" (UniqueName: \"kubernetes.io/projected/72dcd5f8-1d52-42ab-a481-313bf4f5148d-kube-api-access-l6rnx\") pod \"glance-default-external-api-0\" (UID: \"72dcd5f8-1d52-42ab-a481-313bf4f5148d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.974942 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.984157 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-krnxz"] Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.987559 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.990608 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-wds6g" Nov 22 03:44:43 crc kubenswrapper[4952]: I1122 03:44:43.995032 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-krnxz"] Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.000793 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.153423 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.190643 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.190724 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.191930 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd9z6\" (UniqueName: \"kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.192038 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.293861 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.293906 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294067 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294126 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294142 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294164 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294243 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294282 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkrfc\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294323 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts\") pod \"b8b17cec-6979-496c-8e36-40060c31ed82\" (UID: \"b8b17cec-6979-496c-8e36-40060c31ed82\") " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294666 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294713 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294761 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd9z6\" (UniqueName: \"kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.294813 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.295044 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs" (OuterVolumeSpecName: "logs") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.295556 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.303603 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.304533 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts" (OuterVolumeSpecName: "scripts") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.304767 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.306089 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.306400 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.312229 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc" (OuterVolumeSpecName: "kube-api-access-pkrfc") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "kube-api-access-pkrfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.312302 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph" (OuterVolumeSpecName: "ceph") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.328094 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd9z6\" (UniqueName: \"kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6\") pod \"manila-db-sync-krnxz\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.351696 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-krnxz" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.375802 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400226 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400257 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400269 4952 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8b17cec-6979-496c-8e36-40060c31ed82-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400280 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400306 4952 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400316 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkrfc\" (UniqueName: \"kubernetes.io/projected/b8b17cec-6979-496c-8e36-40060c31ed82-kube-api-access-pkrfc\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.400325 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.406879 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data" (OuterVolumeSpecName: "config-data") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.419730 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b8b17cec-6979-496c-8e36-40060c31ed82" (UID: "b8b17cec-6979-496c-8e36-40060c31ed82"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.456966 4952 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.501793 4952 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.501833 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.501843 4952 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b17cec-6979-496c-8e36-40060c31ed82-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.531876 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:44:44 crc kubenswrapper[4952]: E1122 03:44:44.532578 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.558452 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce" path="/var/lib/kubelet/pods/5a5527a7-6ad7-4c80-96c9-3cf94f2cc6ce/volumes" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.583347 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8b17cec-6979-496c-8e36-40060c31ed82","Type":"ContainerDied","Data":"cc5298b811d0c584151d4c8c1387f1fc00ccae31b892cc82dda43d0a0fe943ea"} Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.583398 4952 scope.go:117] "RemoveContainer" containerID="7a15769e6d4aec4af039fcac903214acfffd208ad2b150d16b025e863d65c5c6" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.583567 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.669796 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.692513 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.722067 4952 scope.go:117] "RemoveContainer" containerID="fd57823adc23072fab45f6b18e1808582e349460f56b3213ac7cf7bab23ec150" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.723052 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.749889 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:44 crc kubenswrapper[4952]: E1122 03:44:44.750381 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-httpd" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.750397 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-httpd" Nov 22 03:44:44 crc kubenswrapper[4952]: E1122 03:44:44.750426 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-log" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.750432 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-log" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.750669 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-log" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.750696 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" containerName="glance-httpd" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.751997 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.759415 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.759800 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.763345 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.913190 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.913673 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.913705 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.913739 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.914140 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcq5\" (UniqueName: \"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-kube-api-access-pfcq5\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.914242 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.914381 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.914428 4952 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:44 crc kubenswrapper[4952]: I1122 03:44:44.914443 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-logs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016488 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016549 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-logs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016568 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016615 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016640 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016664 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016747 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016832 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcq5\" (UniqueName: 
\"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-kube-api-access-pfcq5\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.016864 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.019176 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.019374 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.019507 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39989530-1e90-4cb1-b7c5-681a9bdb322b-logs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.024327 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.026488 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.028401 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.029903 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.039738 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39989530-1e90-4cb1-b7c5-681a9bdb322b-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.048847 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcq5\" (UniqueName: \"kubernetes.io/projected/39989530-1e90-4cb1-b7c5-681a9bdb322b-kube-api-access-pfcq5\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.055986 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-krnxz"] Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.059504 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"39989530-1e90-4cb1-b7c5-681a9bdb322b\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: W1122 03:44:45.059924 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701fbc88_a1f6_493d_b6cf_3397073a8d9d.slice/crio-8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4 WatchSource:0}: Error finding container 8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4: Status 404 returned error can't find the container with id 8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4 Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.082335 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.699595 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72dcd5f8-1d52-42ab-a481-313bf4f5148d","Type":"ContainerStarted","Data":"e75453f2194341f4105d90777755f3ed2f2909965c7af6d1e45229a72d33e9d4"} Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.722903 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-krnxz" event={"ID":"701fbc88-a1f6-493d-b6cf-3397073a8d9d","Type":"ContainerStarted","Data":"8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4"} Nov 22 03:44:45 crc kubenswrapper[4952]: I1122 03:44:45.770935 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:44:46 crc kubenswrapper[4952]: I1122 03:44:46.543415 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b17cec-6979-496c-8e36-40060c31ed82" path="/var/lib/kubelet/pods/b8b17cec-6979-496c-8e36-40060c31ed82/volumes" Nov 22 03:44:46 crc kubenswrapper[4952]: I1122 03:44:46.745123 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72dcd5f8-1d52-42ab-a481-313bf4f5148d","Type":"ContainerStarted","Data":"1dff881a8104fe8dc66b599935d829e604c288f0a86ced8336cf733cff0cfe6b"} Nov 22 03:44:48 crc kubenswrapper[4952]: I1122 03:44:48.518293 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 22 03:44:48 crc kubenswrapper[4952]: I1122 03:44:48.526319 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 22 03:44:51 crc kubenswrapper[4952]: I1122 03:44:51.804487 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"39989530-1e90-4cb1-b7c5-681a9bdb322b","Type":"ContainerStarted","Data":"cddfe8886584d2f43b52be4732ff92c6af9c12c3e0ebbcff6df80e88e4dbcf56"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.818675 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56848f9f44-m7z42" event={"ID":"ea3db97f-72b6-4eaa-b8ea-256a5691008f","Type":"ContainerStarted","Data":"f9c3530d9fbeb51f0cac56d37bd9139788195604faa791963a5fbf818f76ce43"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.819285 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56848f9f44-m7z42" event={"ID":"ea3db97f-72b6-4eaa-b8ea-256a5691008f","Type":"ContainerStarted","Data":"8022905d9603fbeb7239242db090c53873e51d35ba9eba6c5e3ee1f6a3984be8"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.822902 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d47b8cc7-k9qjl" event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerStarted","Data":"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.822949 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d47b8cc7-k9qjl" event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerStarted","Data":"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.822994 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d47b8cc7-k9qjl" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon-log" containerID="cri-o://373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" gracePeriod=30 Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.823008 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d47b8cc7-k9qjl" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon" containerID="cri-o://b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" gracePeriod=30 Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.827633 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72dcd5f8-1d52-42ab-a481-313bf4f5148d","Type":"ContainerStarted","Data":"d53e538e7bb015765cb41a4f8c52349314f7834e30c5bfd26f27faa695194e95"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.830152 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerStarted","Data":"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.830177 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerStarted","Data":"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.837219 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39989530-1e90-4cb1-b7c5-681a9bdb322b","Type":"ContainerStarted","Data":"70868c9f1b2fd49ef0522b2d721cb8896c28ebad2b44bde32d61221cb824d31c"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.848790 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerStarted","Data":"78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.848852 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerStarted","Data":"f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25"} Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.849031 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dccb8556f-cmgsg" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon-log" containerID="cri-o://f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25" gracePeriod=30 Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.849172 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dccb8556f-cmgsg" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon" containerID="cri-o://78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6" gracePeriod=30 Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.886498 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56848f9f44-m7z42" podStartSLOduration=2.691683057 podStartE2EDuration="11.886474557s" podCreationTimestamp="2025-11-22 03:44:41 +0000 UTC" firstStartedPulling="2025-11-22 03:44:42.534704825 +0000 UTC m=+3046.840722098" lastFinishedPulling="2025-11-22 03:44:51.729496335 +0000 UTC m=+3056.035513598" observedRunningTime="2025-11-22 03:44:52.843339633 +0000 UTC m=+3057.149356906" watchObservedRunningTime="2025-11-22 03:44:52.886474557 +0000 UTC m=+3057.192491830" Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.920870 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.920848558 podStartE2EDuration="9.920848558s" podCreationTimestamp="2025-11-22 03:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:44:52.865612553 +0000 UTC m=+3057.171629826" watchObservedRunningTime="2025-11-22 03:44:52.920848558 +0000 UTC m=+3057.226865841" Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.940373 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5675778f5b-wg7px" podStartSLOduration=2.813916607 podStartE2EDuration="11.940353335s" podCreationTimestamp="2025-11-22 03:44:41 +0000 UTC" firstStartedPulling="2025-11-22 03:44:42.560164799 +0000 UTC m=+3046.866182072" lastFinishedPulling="2025-11-22 03:44:51.686601527 +0000 UTC m=+3055.992618800" observedRunningTime="2025-11-22 03:44:52.901211487 +0000 UTC m=+3057.207228790" watchObservedRunningTime="2025-11-22 03:44:52.940353335 +0000 UTC m=+3057.246370608" Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.962706 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69d47b8cc7-k9qjl" podStartSLOduration=3.476728607 podStartE2EDuration="14.962682677s" podCreationTimestamp="2025-11-22 03:44:38 +0000 UTC" firstStartedPulling="2025-11-22 03:44:40.161470248 +0000 UTC m=+3044.467487521" lastFinishedPulling="2025-11-22 03:44:51.647424308 +0000 UTC m=+3055.953441591" observedRunningTime="2025-11-22 03:44:52.925006168 +0000 UTC 
m=+3057.231023441" watchObservedRunningTime="2025-11-22 03:44:52.962682677 +0000 UTC m=+3057.268699950" Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.978888 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.978870087 podStartE2EDuration="8.978870087s" podCreationTimestamp="2025-11-22 03:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:44:52.96466228 +0000 UTC m=+3057.270679553" watchObservedRunningTime="2025-11-22 03:44:52.978870087 +0000 UTC m=+3057.284887350" Nov 22 03:44:52 crc kubenswrapper[4952]: I1122 03:44:52.986895 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dccb8556f-cmgsg" podStartSLOduration=2.957471997 podStartE2EDuration="14.98687921s" podCreationTimestamp="2025-11-22 03:44:38 +0000 UTC" firstStartedPulling="2025-11-22 03:44:39.620020048 +0000 UTC m=+3043.926037321" lastFinishedPulling="2025-11-22 03:44:51.649427211 +0000 UTC m=+3055.955444534" observedRunningTime="2025-11-22 03:44:52.985948695 +0000 UTC m=+3057.291965978" watchObservedRunningTime="2025-11-22 03:44:52.98687921 +0000 UTC m=+3057.292896483" Nov 22 03:44:53 crc kubenswrapper[4952]: I1122 03:44:53.863936 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"39989530-1e90-4cb1-b7c5-681a9bdb322b","Type":"ContainerStarted","Data":"511c5ddbc3f7f7246f5b774804b2c33f27576c09786a7c216f0105be84768e98"} Nov 22 03:44:53 crc kubenswrapper[4952]: I1122 03:44:53.975281 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 03:44:53 crc kubenswrapper[4952]: I1122 03:44:53.976561 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 03:44:54 crc kubenswrapper[4952]: I1122 03:44:54.022135 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 03:44:54 crc kubenswrapper[4952]: I1122 03:44:54.030522 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 03:44:54 crc kubenswrapper[4952]: I1122 03:44:54.876911 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 03:44:54 crc kubenswrapper[4952]: I1122 03:44:54.876954 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.084065 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.084125 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.122589 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.147360 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.886033 4952 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:55 crc kubenswrapper[4952]: I1122 03:44:55.893022 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:58 crc kubenswrapper[4952]: I1122 03:44:58.913118 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-krnxz" event={"ID":"701fbc88-a1f6-493d-b6cf-3397073a8d9d","Type":"ContainerStarted","Data":"d1c3fbba899232dfe942c19bcb5664ac1ec804e3b21d27dd8b09313b440ba735"} Nov 22 03:44:58 crc kubenswrapper[4952]: I1122 03:44:58.933828 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-krnxz" podStartSLOduration=2.774978454 podStartE2EDuration="15.933812538s" podCreationTimestamp="2025-11-22 03:44:43 +0000 UTC" firstStartedPulling="2025-11-22 03:44:45.067480082 +0000 UTC m=+3049.373497355" lastFinishedPulling="2025-11-22 03:44:58.226314166 +0000 UTC m=+3062.532331439" observedRunningTime="2025-11-22 03:44:58.928015904 +0000 UTC m=+3063.234033177" watchObservedRunningTime="2025-11-22 03:44:58.933812538 +0000 UTC m=+3063.239829811" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.019481 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.123182 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.532263 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:44:59 crc kubenswrapper[4952]: E1122 03:44:59.532480 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.875047 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.881638 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.896306 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 03:44:59 crc kubenswrapper[4952]: I1122 03:44:59.913071 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.153692 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw"] Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.165768 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw"] Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.165871 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.177236 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.177713 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.257011 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.257068 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zcf\" (UniqueName: \"kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.257104 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.359062 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.359124 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zcf\" (UniqueName: \"kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.359186 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.359933 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc 
kubenswrapper[4952]: I1122 03:45:00.364642 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.379942 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zcf\" (UniqueName: \"kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf\") pod \"collect-profiles-29396385-r8zfw\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.482192 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:00 crc kubenswrapper[4952]: I1122 03:45:00.964532 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw"] Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.698761 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.700296 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.711731 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.712838 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.943430 4952 generic.go:334] "Generic (PLEG): container finished" podID="92ab05d2-2b72-4a8b-b649-6802aba4fc7d" containerID="243943688a8d366a7d32df2c915a26d369adbcaaf6dabfbf476f6affc344daa1" exitCode=0 Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.943512 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" event={"ID":"92ab05d2-2b72-4a8b-b649-6802aba4fc7d","Type":"ContainerDied","Data":"243943688a8d366a7d32df2c915a26d369adbcaaf6dabfbf476f6affc344daa1"} Nov 22 03:45:01 crc kubenswrapper[4952]: I1122 03:45:01.943978 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" event={"ID":"92ab05d2-2b72-4a8b-b649-6802aba4fc7d","Type":"ContainerStarted","Data":"5c7a37e6a975323d4596b0e72b8159f2dad9fd806a182530a64d549b01a31623"} Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.431126 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.539896 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume\") pod \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.540141 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume\") pod \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.540279 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7zcf\" (UniqueName: \"kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf\") pod \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\" (UID: \"92ab05d2-2b72-4a8b-b649-6802aba4fc7d\") " Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.540460 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "92ab05d2-2b72-4a8b-b649-6802aba4fc7d" (UID: "92ab05d2-2b72-4a8b-b649-6802aba4fc7d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.540862 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.546146 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "92ab05d2-2b72-4a8b-b649-6802aba4fc7d" (UID: "92ab05d2-2b72-4a8b-b649-6802aba4fc7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.548741 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf" (OuterVolumeSpecName: "kube-api-access-g7zcf") pod "92ab05d2-2b72-4a8b-b649-6802aba4fc7d" (UID: "92ab05d2-2b72-4a8b-b649-6802aba4fc7d"). InnerVolumeSpecName "kube-api-access-g7zcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.643068 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.643111 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7zcf\" (UniqueName: \"kubernetes.io/projected/92ab05d2-2b72-4a8b-b649-6802aba4fc7d-kube-api-access-g7zcf\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.996420 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" event={"ID":"92ab05d2-2b72-4a8b-b649-6802aba4fc7d","Type":"ContainerDied","Data":"5c7a37e6a975323d4596b0e72b8159f2dad9fd806a182530a64d549b01a31623"} Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.996462 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7a37e6a975323d4596b0e72b8159f2dad9fd806a182530a64d549b01a31623" Nov 22 03:45:03 crc kubenswrapper[4952]: I1122 03:45:03.996522 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw" Nov 22 03:45:04 crc kubenswrapper[4952]: I1122 03:45:04.511735 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p"] Nov 22 03:45:04 crc kubenswrapper[4952]: I1122 03:45:04.523350 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-ksr4p"] Nov 22 03:45:04 crc kubenswrapper[4952]: I1122 03:45:04.548181 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1b83df-c857-4b28-beab-c05670fe1b0b" path="/var/lib/kubelet/pods/bc1b83df-c857-4b28-beab-c05670fe1b0b/volumes" Nov 22 03:45:11 crc kubenswrapper[4952]: I1122 03:45:11.083286 4952 generic.go:334] "Generic (PLEG): container finished" podID="701fbc88-a1f6-493d-b6cf-3397073a8d9d" containerID="d1c3fbba899232dfe942c19bcb5664ac1ec804e3b21d27dd8b09313b440ba735" exitCode=0 Nov 22 03:45:11 crc kubenswrapper[4952]: I1122 03:45:11.083647 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-krnxz" event={"ID":"701fbc88-a1f6-493d-b6cf-3397073a8d9d","Type":"ContainerDied","Data":"d1c3fbba899232dfe942c19bcb5664ac1ec804e3b21d27dd8b09313b440ba735"} Nov 22 03:45:11 crc kubenswrapper[4952]: I1122 03:45:11.531223 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:45:11 crc kubenswrapper[4952]: E1122 03:45:11.531752 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.575993 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-krnxz" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.661732 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data\") pod \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.662330 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle\") pod \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.662399 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd9z6\" (UniqueName: \"kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6\") pod \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.662471 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data\") pod \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\" (UID: \"701fbc88-a1f6-493d-b6cf-3397073a8d9d\") " Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.670038 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6" (OuterVolumeSpecName: "kube-api-access-qd9z6") pod "701fbc88-a1f6-493d-b6cf-3397073a8d9d" (UID: "701fbc88-a1f6-493d-b6cf-3397073a8d9d"). InnerVolumeSpecName "kube-api-access-qd9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.671842 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "701fbc88-a1f6-493d-b6cf-3397073a8d9d" (UID: "701fbc88-a1f6-493d-b6cf-3397073a8d9d"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.675659 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data" (OuterVolumeSpecName: "config-data") pod "701fbc88-a1f6-493d-b6cf-3397073a8d9d" (UID: "701fbc88-a1f6-493d-b6cf-3397073a8d9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.703783 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701fbc88-a1f6-493d-b6cf-3397073a8d9d" (UID: "701fbc88-a1f6-493d-b6cf-3397073a8d9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.765990 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.766028 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.766044 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd9z6\" (UniqueName: \"kubernetes.io/projected/701fbc88-a1f6-493d-b6cf-3397073a8d9d-kube-api-access-qd9z6\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:12 crc kubenswrapper[4952]: I1122 03:45:12.766059 4952 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/701fbc88-a1f6-493d-b6cf-3397073a8d9d-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.113973 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-krnxz" event={"ID":"701fbc88-a1f6-493d-b6cf-3397073a8d9d","Type":"ContainerDied","Data":"8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4"} Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.114051 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a2ac23f53435e6ac62a0ba8c8f0c99e42ca7cb88fc33555b4e96e9f64996cc4" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.114158 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-krnxz" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.417043 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.604221 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: E1122 03:45:13.604940 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701fbc88-a1f6-493d-b6cf-3397073a8d9d" containerName="manila-db-sync" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.604952 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="701fbc88-a1f6-493d-b6cf-3397073a8d9d" containerName="manila-db-sync" Nov 22 03:45:13 crc kubenswrapper[4952]: E1122 03:45:13.604989 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ab05d2-2b72-4a8b-b649-6802aba4fc7d" containerName="collect-profiles" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.604995 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ab05d2-2b72-4a8b-b649-6802aba4fc7d" containerName="collect-profiles" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.605198 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="701fbc88-a1f6-493d-b6cf-3397073a8d9d" containerName="manila-db-sync" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.605216 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ab05d2-2b72-4a8b-b649-6802aba4fc7d" containerName="collect-profiles" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.606267 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.615502 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.615519 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-wds6g" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.615719 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.616068 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.636632 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.638535 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.646326 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.646588 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.668100 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.689389 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wzpbj"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.691103 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696005 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696076 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696098 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696115 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696157 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696182 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696264 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696293 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696346 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc 
kubenswrapper[4952]: I1122 03:45:13.696362 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696392 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696410 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mr7z\" (UniqueName: \"kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696446 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.696461 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj6v\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.703015 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wzpbj"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.724145 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.777334 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.779641 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.786893 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.787349 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799167 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mr7z\" (UniqueName: \"kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799250 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799283 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799304 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj6v\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799345 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799362 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799390 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799407 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799423 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799438 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799454 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799470 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799511 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799537 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799585 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799607 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-config\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799670 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799689 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle\") 
pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799710 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxll6\" (UniqueName: \"kubernetes.io/projected/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-kube-api-access-bxll6\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.799744 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.800492 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.800851 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.814277 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.814973 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.815710 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.815759 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.816691 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.823308 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.824164 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.824797 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.832248 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.841237 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj6v\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v\") pod \"manila-share-share1-0\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") " pod="openstack/manila-share-share1-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.841841 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.842189 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mr7z\" (UniqueName: \"kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z\") pod \"manila-scheduler-0\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.901987 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxll6\" (UniqueName: \"kubernetes.io/projected/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-kube-api-access-bxll6\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902124 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902192 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id\") pod \"manila-api-0\" (UID: 
\"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902231 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902277 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902311 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902352 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txbfj\" (UniqueName: \"kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902390 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902442 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902486 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-config\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.902625 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.904416 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.904802 4952 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.904869 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.904916 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.905605 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-config\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.905656 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.905984 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.925949 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxll6\" (UniqueName: \"kubernetes.io/projected/91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7-kube-api-access-bxll6\") pod \"dnsmasq-dns-76b5fdb995-wzpbj\" (UID: \"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7\") " pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.926349 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:45:13 crc kubenswrapper[4952]: I1122 03:45:13.979782 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007049 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txbfj\" (UniqueName: \"kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007317 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007486 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007675 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007790 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.007979 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.008179 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.008318 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.009520 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.012828 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " 
pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.015598 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.021363 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.029096 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.035013 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.042103 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txbfj\" (UniqueName: \"kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj\") pod \"manila-api-0\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.105909 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.462079 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.628280 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-wzpbj"] Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.773993 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:45:14 crc kubenswrapper[4952]: I1122 03:45:14.922881 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:15 crc kubenswrapper[4952]: I1122 03:45:15.173363 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" event={"ID":"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7","Type":"ContainerStarted","Data":"fa0e144519e319d54a40948b7194467e264ecab1f4a4d9578553e1a4f72c62e3"} Nov 22 03:45:15 crc kubenswrapper[4952]: I1122 03:45:15.182769 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerStarted","Data":"84b529c416033f2482559d57f27642d1e87e52b0e37f97ca279b0c632aa4c4ee"} Nov 22 03:45:15 crc kubenswrapper[4952]: I1122 03:45:15.184229 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerStarted","Data":"87a7cbe3cb60ea215332f4a30105294dc71d3955c86ad80db23150f7a57225ff"} Nov 22 03:45:15 crc kubenswrapper[4952]: I1122 03:45:15.185311 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerStarted","Data":"b647dea27fc1e995c37a1425c6136abeb1962947a8ceed4303df2c8be636149b"} Nov 22 03:45:15 crc kubenswrapper[4952]: I1122 03:45:15.989857 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56848f9f44-m7z42" Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.065809 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.066156 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon-log" containerID="cri-o://fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257" gracePeriod=30 Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.066509 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" containerID="cri-o://3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f" gracePeriod=30 Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.089619 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.240790 4952 generic.go:334] "Generic (PLEG): container finished" podID="91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7" containerID="ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d" exitCode=0 Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.240988 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" event={"ID":"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7","Type":"ContainerDied","Data":"ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d"} Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.251140 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerStarted","Data":"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b"} Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.253689 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerStarted","Data":"a2204f983935a8056dc5ff1c9ba5b7f67a88561af2d2b80fd7854d064b1cdcb0"} Nov 22 03:45:16 crc kubenswrapper[4952]: I1122 03:45:16.858460 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.265226 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerStarted","Data":"71eaa9a82a8ad905bc424adfc104d52070a492c3f97b99f196f061834c2de3bf"} Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.268658 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" event={"ID":"91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7","Type":"ContainerStarted","Data":"86fd86c7a45d482699032c82cb75469b57bec916f392e6bdfd264a8a1705045e"} Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.268870 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.271155 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerStarted","Data":"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4"} Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.271676 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.287088 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.370177197 podStartE2EDuration="4.287070753s" podCreationTimestamp="2025-11-22 03:45:13 +0000 UTC" firstStartedPulling="2025-11-22 03:45:14.475220965 +0000 UTC m=+3078.781238238" lastFinishedPulling="2025-11-22 03:45:15.392114521 +0000 UTC m=+3079.698131794" observedRunningTime="2025-11-22 03:45:17.284633698 +0000 UTC m=+3081.590650971" watchObservedRunningTime="2025-11-22 03:45:17.287070753 +0000 UTC m=+3081.593088026" Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.306211 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" podStartSLOduration=4.30619588 podStartE2EDuration="4.30619588s" podCreationTimestamp="2025-11-22 03:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:45:17.305467381 +0000 UTC m=+3081.611484664" watchObservedRunningTime="2025-11-22 03:45:17.30619588 +0000 UTC m=+3081.612213153" Nov 22 03:45:17 crc kubenswrapper[4952]: I1122 03:45:17.321742 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.321722332 podStartE2EDuration="4.321722332s" podCreationTimestamp="2025-11-22 03:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:45:17.31938181 +0000 UTC m=+3081.625399083" watchObservedRunningTime="2025-11-22 03:45:17.321722332 +0000 UTC m=+3081.627739595" Nov 22 03:45:18 crc kubenswrapper[4952]: I1122 03:45:18.285526 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api-log" containerID="cri-o://79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" gracePeriod=30 Nov 22 03:45:18 crc kubenswrapper[4952]: I1122 03:45:18.286497 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api" containerID="cri-o://b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" gracePeriod=30 Nov 22 03:45:18 crc kubenswrapper[4952]: I1122 03:45:18.981806 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140074 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140266 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140308 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txbfj\" (UniqueName: \"kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140342 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140402 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140450 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140491 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id\") pod \"44b3c7de-e46e-45ce-aec3-090a38e347cf\" (UID: \"44b3c7de-e46e-45ce-aec3-090a38e347cf\") " Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140560 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs" (OuterVolumeSpecName: "logs") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.140682 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.141397 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b3c7de-e46e-45ce-aec3-090a38e347cf-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.141612 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b3c7de-e46e-45ce-aec3-090a38e347cf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.148926 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj" (OuterVolumeSpecName: "kube-api-access-txbfj") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "kube-api-access-txbfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.149603 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts" (OuterVolumeSpecName: "scripts") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.149698 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.183374 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.213832 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data" (OuterVolumeSpecName: "config-data") pod "44b3c7de-e46e-45ce-aec3-090a38e347cf" (UID: "44b3c7de-e46e-45ce-aec3-090a38e347cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.244152 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.244197 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txbfj\" (UniqueName: \"kubernetes.io/projected/44b3c7de-e46e-45ce-aec3-090a38e347cf-kube-api-access-txbfj\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.244213 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.244225 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.244239 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b3c7de-e46e-45ce-aec3-090a38e347cf-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291152 4952 generic.go:334] "Generic (PLEG): container finished" podID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerID="b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" exitCode=0 Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291184 4952 generic.go:334] "Generic (PLEG): container finished" podID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerID="79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" exitCode=143 Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291213 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291210 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerDied","Data":"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4"} Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291356 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerDied","Data":"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b"} Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291379 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"44b3c7de-e46e-45ce-aec3-090a38e347cf","Type":"ContainerDied","Data":"84b529c416033f2482559d57f27642d1e87e52b0e37f97ca279b0c632aa4c4ee"} Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.291394 4952 scope.go:117] "RemoveContainer" containerID="b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.318337 4952 scope.go:117] "RemoveContainer" containerID="79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.326259 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.336448 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.343393 4952 scope.go:117] "RemoveContainer" containerID="b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" Nov 22 03:45:19 crc kubenswrapper[4952]: E1122 03:45:19.344086 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4\": container with ID starting with b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4 not found: ID does not exist" containerID="b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.344127 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4"} err="failed to get container status \"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4\": rpc error: code = NotFound desc = could not find container \"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4\": container with ID starting with b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4 not found: ID does not exist" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.344153 4952 scope.go:117] "RemoveContainer" containerID="79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" Nov 22 03:45:19 crc kubenswrapper[4952]: E1122 03:45:19.344588 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b\": container with ID starting with 79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b not found: ID does not exist" containerID="79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.344618 4952 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b"} err="failed to get container status \"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b\": rpc error: code = NotFound desc = could not find container \"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b\": container with ID starting with 79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b not found: ID does not exist" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.344635 4952 scope.go:117] "RemoveContainer" containerID="b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.345051 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4"} err="failed to get container status \"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4\": rpc error: code = NotFound desc = could not find container \"b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4\": container with ID starting with b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4 not found: ID does not exist" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.345089 4952 scope.go:117] "RemoveContainer" containerID="79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.345625 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b"} err="failed to get container status \"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b\": rpc error: code = NotFound desc = could not find container \"79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b\": container with ID starting with 79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b not found: ID does not exist" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.363893 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:19 crc kubenswrapper[4952]: E1122 03:45:19.364649 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.364676 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api" Nov 22 03:45:19 crc kubenswrapper[4952]: E1122 03:45:19.364699 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api-log" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.364735 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api-log" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.365085 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.365110 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" containerName="manila-api-log" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.366587 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.370533 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.371853 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.372066 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.382644 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450378 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-public-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450450 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aa384b1-1586-4709-b258-def203cac8f5-etc-machine-id\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450496 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data-custom\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450518 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa384b1-1586-4709-b258-def203cac8f5-logs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450572 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450604 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rptc\" (UniqueName: \"kubernetes.io/projected/7aa384b1-1586-4709-b258-def203cac8f5-kube-api-access-5rptc\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450618 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-scripts\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450651 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.450670 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.552898 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.552952 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553027 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-public-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553085 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7aa384b1-1586-4709-b258-def203cac8f5-etc-machine-id\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553144 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data-custom\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553173 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa384b1-1586-4709-b258-def203cac8f5-logs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553224 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553273 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-scripts\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553270 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7aa384b1-1586-4709-b258-def203cac8f5-etc-machine-id\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553295 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rptc\" (UniqueName: \"kubernetes.io/projected/7aa384b1-1586-4709-b258-def203cac8f5-kube-api-access-5rptc\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.553782 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aa384b1-1586-4709-b258-def203cac8f5-logs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.557612 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.557691 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.558023 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-scripts\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.558315 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-config-data-custom\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.560114 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-public-tls-certs\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.561441 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa384b1-1586-4709-b258-def203cac8f5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.572315 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rptc\" (UniqueName: \"kubernetes.io/projected/7aa384b1-1586-4709-b258-def203cac8f5-kube-api-access-5rptc\") pod \"manila-api-0\" (UID: \"7aa384b1-1586-4709-b258-def203cac8f5\") " pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.786962 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.787516 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.787901 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="proxy-httpd" containerID="cri-o://c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9" gracePeriod=30 Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.787962 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="sg-core" containerID="cri-o://803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4" gracePeriod=30 Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.788003 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-notification-agent" containerID="cri-o://fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131" gracePeriod=30 Nov 22 03:45:19 crc kubenswrapper[4952]: I1122 03:45:19.788061 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-central-agent" containerID="cri-o://ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621" gracePeriod=30 Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308246 4952 generic.go:334] "Generic (PLEG): container finished" podID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerID="c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9" exitCode=0 Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308651 4952 generic.go:334] "Generic (PLEG): container finished" podID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerID="803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4" exitCode=2 Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308708 4952 generic.go:334] "Generic (PLEG): container finished" podID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerID="ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621" exitCode=0 Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308334 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerDied","Data":"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9"} Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308757 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerDied","Data":"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4"} Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.308779 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerDied","Data":"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621"} Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.473162 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:33546->10.217.0.240:8443: read: connection reset by peer" Nov 22 03:45:20 crc kubenswrapper[4952]: I1122 03:45:20.548315 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b3c7de-e46e-45ce-aec3-090a38e347cf" path="/var/lib/kubelet/pods/44b3c7de-e46e-45ce-aec3-090a38e347cf/volumes" Nov 22 03:45:21 crc kubenswrapper[4952]: I1122 03:45:21.320429 4952 generic.go:334] "Generic (PLEG): container finished" podID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerID="3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f" exitCode=0 Nov 22 03:45:21 crc kubenswrapper[4952]: I1122 03:45:21.320470 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerDied","Data":"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f"} Nov 22 03:45:21 crc kubenswrapper[4952]: I1122 03:45:21.698166 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Nov 22 03:45:22 crc kubenswrapper[4952]: I1122 03:45:22.045369 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.182:3000/\": dial tcp 10.217.0.182:3000: connect: connection refused" Nov 22 03:45:22 crc kubenswrapper[4952]: I1122 03:45:22.534472 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:45:22 crc kubenswrapper[4952]: E1122 03:45:22.534935 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.876925 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-84b529c416033f2482559d57f27642d1e87e52b0e37f97ca279b0c632aa4c4ee": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-84b529c416033f2482559d57f27642d1e87e52b0e37f97ca279b0c632aa4c4ee: no such file or directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.877164 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b94dc0_c7d3_4bc0_8e87_fb387ed6d9a7.slice/crio-conmon-ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b94dc0_c7d3_4bc0_8e87_fb387ed6d9a7.slice/crio-conmon-ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d.scope: no such file or 
directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.877197 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b94dc0_c7d3_4bc0_8e87_fb387ed6d9a7.slice/crio-ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b94dc0_c7d3_4bc0_8e87_fb387ed6d9a7.slice/crio-ab1c56e3a0046374458e9df7f0482da52a23178e4efbec5638246cf8709bcf5d.scope: no such file or directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.877222 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-conmon-79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-conmon-79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b.scope: no such file or directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.877238 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-79a79bfa0f5b1d71cbf20146835e8266c66995eb19ffaf890b3e4981cab0b55b.scope: no such file or directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.878758 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-conmon-b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-conmon-b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4.scope: no such file or directory Nov 22 03:45:22 crc kubenswrapper[4952]: W1122 03:45:22.878824 4952 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice/crio-b9972626e5c7f481bb9811a1b3001bc4e8689281ffd5eafce2671cd76d96bbe4.scope: no such file or directory Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.009849 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:45:23 crc kubenswrapper[4952]: W1122 03:45:23.076251 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa384b1_1586_4709_b258_def203cac8f5.slice/crio-403f23497628c8dcb0e51623514765c140b5e02073dc8291c056ef8ee00c2219 WatchSource:0}: Error finding container 
403f23497628c8dcb0e51623514765c140b5e02073dc8291c056ef8ee00c2219: Status 404 returned error can't find the container with id 403f23497628c8dcb0e51623514765c140b5e02073dc8291c056ef8ee00c2219 Nov 22 03:45:23 crc kubenswrapper[4952]: E1122 03:45:23.234554 4952 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17666473_50ea_48ef_afc8_265daec6df33.slice/crio-conmon-b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-conmon-803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd455bb45_949a_444f_bf5f_61736fbe9c28.slice/crio-conmon-78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-conmon-c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17666473_50ea_48ef_afc8_265daec6df33.slice/crio-373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-conmon-ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8eb2643_4d7c_4814_91fe_1192d3fc753d.slice/crio-conmon-3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8eb2643_4d7c_4814_91fe_1192d3fc753d.slice/crio-3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd455bb45_949a_444f_bf5f_61736fbe9c28.slice/crio-78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17666473_50ea_48ef_afc8_265daec6df33.slice/crio-conmon-373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd455bb45_949a_444f_bf5f_61736fbe9c28.slice/crio-f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b3c7de_e46e_45ce_aec3_090a38e347cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17666473_50ea_48ef_afc8_265daec6df33.slice/crio-b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50403f1a_ba57_4c1f_86a5_67269195ca65.slice/crio-803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd455bb45_949a_444f_bf5f_61736fbe9c28.slice/crio-conmon-f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.317925 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.403877 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerStarted","Data":"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.408417 4952 generic.go:334] "Generic (PLEG): container finished" podID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerID="78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6" exitCode=137 Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.408448 4952 generic.go:334] "Generic (PLEG): container finished" podID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerID="f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25" exitCode=137 Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.408482 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerDied","Data":"78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.408508 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerDied","Data":"f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.410394 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7aa384b1-1586-4709-b258-def203cac8f5","Type":"ContainerStarted","Data":"403f23497628c8dcb0e51623514765c140b5e02073dc8291c056ef8ee00c2219"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413069 4952 generic.go:334] "Generic (PLEG): container finished" podID="17666473-50ea-48ef-afc8-265daec6df33" containerID="b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" exitCode=137 Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413094 4952 generic.go:334] "Generic (PLEG): container finished" podID="17666473-50ea-48ef-afc8-265daec6df33" containerID="373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" exitCode=137 Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413109 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d47b8cc7-k9qjl" 
event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerDied","Data":"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413130 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d47b8cc7-k9qjl" event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerDied","Data":"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413146 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d47b8cc7-k9qjl" event={"ID":"17666473-50ea-48ef-afc8-265daec6df33","Type":"ContainerDied","Data":"9af805f778982576c093a678bd31163cd06118fa6114d6d8b610806b0e607241"} Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413160 4952 scope.go:117] "RemoveContainer" containerID="b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.413268 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d47b8cc7-k9qjl" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.457508 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts\") pod \"17666473-50ea-48ef-afc8-265daec6df33\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.457583 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs\") pod \"17666473-50ea-48ef-afc8-265daec6df33\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.457609 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data\") pod \"17666473-50ea-48ef-afc8-265daec6df33\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.457837 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hnsg\" (UniqueName: \"kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg\") pod \"17666473-50ea-48ef-afc8-265daec6df33\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.457883 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key\") pod \"17666473-50ea-48ef-afc8-265daec6df33\" (UID: \"17666473-50ea-48ef-afc8-265daec6df33\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.458983 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs" (OuterVolumeSpecName: "logs") pod "17666473-50ea-48ef-afc8-265daec6df33" (UID: "17666473-50ea-48ef-afc8-265daec6df33"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.465737 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "17666473-50ea-48ef-afc8-265daec6df33" (UID: "17666473-50ea-48ef-afc8-265daec6df33"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.489209 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg" (OuterVolumeSpecName: "kube-api-access-5hnsg") pod "17666473-50ea-48ef-afc8-265daec6df33" (UID: "17666473-50ea-48ef-afc8-265daec6df33"). InnerVolumeSpecName "kube-api-access-5hnsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.535202 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts" (OuterVolumeSpecName: "scripts") pod "17666473-50ea-48ef-afc8-265daec6df33" (UID: "17666473-50ea-48ef-afc8-265daec6df33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.548230 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data" (OuterVolumeSpecName: "config-data") pod "17666473-50ea-48ef-afc8-265daec6df33" (UID: "17666473-50ea-48ef-afc8-265daec6df33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.559914 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hnsg\" (UniqueName: \"kubernetes.io/projected/17666473-50ea-48ef-afc8-265daec6df33-kube-api-access-5hnsg\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.559947 4952 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17666473-50ea-48ef-afc8-265daec6df33-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.559957 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.559965 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17666473-50ea-48ef-afc8-265daec6df33-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.559973 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17666473-50ea-48ef-afc8-265daec6df33-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.713654 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.777454 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.788194 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69d47b8cc7-k9qjl"] Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.871262 4952 scope.go:117] "RemoveContainer" containerID="373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.877056 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data\") pod \"d455bb45-949a-444f-bf5f-61736fbe9c28\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.877107 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key\") pod \"d455bb45-949a-444f-bf5f-61736fbe9c28\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.877202 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs\") pod \"d455bb45-949a-444f-bf5f-61736fbe9c28\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.877308 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts\") pod \"d455bb45-949a-444f-bf5f-61736fbe9c28\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.877449 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvcmf\" (UniqueName: \"kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf\") pod \"d455bb45-949a-444f-bf5f-61736fbe9c28\" (UID: \"d455bb45-949a-444f-bf5f-61736fbe9c28\") " Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.878292 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs" (OuterVolumeSpecName: "logs") pod "d455bb45-949a-444f-bf5f-61736fbe9c28" (UID: "d455bb45-949a-444f-bf5f-61736fbe9c28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.881424 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf" (OuterVolumeSpecName: "kube-api-access-mvcmf") pod "d455bb45-949a-444f-bf5f-61736fbe9c28" (UID: "d455bb45-949a-444f-bf5f-61736fbe9c28"). InnerVolumeSpecName "kube-api-access-mvcmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.882880 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d455bb45-949a-444f-bf5f-61736fbe9c28" (UID: "d455bb45-949a-444f-bf5f-61736fbe9c28"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.903452 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts" (OuterVolumeSpecName: "scripts") pod "d455bb45-949a-444f-bf5f-61736fbe9c28" (UID: "d455bb45-949a-444f-bf5f-61736fbe9c28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.909750 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data" (OuterVolumeSpecName: "config-data") pod "d455bb45-949a-444f-bf5f-61736fbe9c28" (UID: "d455bb45-949a-444f-bf5f-61736fbe9c28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.926584 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.958935 4952 scope.go:117] "RemoveContainer" containerID="b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" Nov 22 03:45:23 crc kubenswrapper[4952]: E1122 03:45:23.959918 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99\": container with ID starting with b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99 not found: ID does not exist" containerID="b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.959961 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99"} err="failed to get container status \"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99\": rpc error: code = NotFound desc = could not find container \"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99\": container with ID starting with b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99 not found: ID does not exist" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.959985 4952 scope.go:117] "RemoveContainer" containerID="373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" Nov 22 03:45:23 crc kubenswrapper[4952]: E1122 03:45:23.960304 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264\": container with ID starting with 373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264 not found: ID does not exist" containerID="373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.960350 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264"} err="failed to get container status \"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264\": rpc error: code = NotFound desc = could not find container \"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264\": container with ID starting with 
373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264 not found: ID does not exist" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.960376 4952 scope.go:117] "RemoveContainer" containerID="b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.960718 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99"} err="failed to get container status \"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99\": rpc error: code = NotFound desc = could not find container \"b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99\": container with ID starting with b0b99556c4447ad4ea24d77abc7a4a125cea9dff413425d71d0913f88307cd99 not found: ID does not exist" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.960738 4952 scope.go:117] "RemoveContainer" containerID="373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.961023 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264"} err="failed to get container status \"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264\": rpc error: code = NotFound desc = could not find container \"373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264\": container with ID starting with 373d0be2b365981de93ade81dad242a9026128931cdac56197415316f0e80264 not found: ID does not exist" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.979820 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.979851 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvcmf\" (UniqueName: \"kubernetes.io/projected/d455bb45-949a-444f-bf5f-61736fbe9c28-kube-api-access-mvcmf\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.979860 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d455bb45-949a-444f-bf5f-61736fbe9c28-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.979871 4952 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d455bb45-949a-444f-bf5f-61736fbe9c28-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:23 crc kubenswrapper[4952]: I1122 03:45:23.979879 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d455bb45-949a-444f-bf5f-61736fbe9c28-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.024234 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-wzpbj" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.104807 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.105246 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" 
containerName="dnsmasq-dns" containerID="cri-o://845c1f6384512c20d715342d78f2350b48ffadc195ad0e2f696d985321b7a12e" gracePeriod=10 Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.428477 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7aa384b1-1586-4709-b258-def203cac8f5","Type":"ContainerStarted","Data":"089f1956ebc5fa372eee620f1b782e39b43381d42d1751a339fabe14813c791f"} Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.438328 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerStarted","Data":"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"} Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.443731 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dccb8556f-cmgsg" event={"ID":"d455bb45-949a-444f-bf5f-61736fbe9c28","Type":"ContainerDied","Data":"10f01e8070ab5448bfafcdf4e552c8cb82e83e9e8b37dad06243c72cd11c9284"} Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.443781 4952 scope.go:117] "RemoveContainer" containerID="78fcf1edf83404e2d54eb637ecfc495cc2dfbfac93f2f7dd76ac810b4f72b4c6" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.443827 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dccb8556f-cmgsg" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.455241 4952 generic.go:334] "Generic (PLEG): container finished" podID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerID="845c1f6384512c20d715342d78f2350b48ffadc195ad0e2f696d985321b7a12e" exitCode=0 Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.455279 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" event={"ID":"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff","Type":"ContainerDied","Data":"845c1f6384512c20d715342d78f2350b48ffadc195ad0e2f696d985321b7a12e"} Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.479419 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.945610018 podStartE2EDuration="11.479398459s" podCreationTimestamp="2025-11-22 03:45:13 +0000 UTC" firstStartedPulling="2025-11-22 03:45:14.948885146 +0000 UTC m=+3079.254902429" lastFinishedPulling="2025-11-22 03:45:22.482673587 +0000 UTC m=+3086.788690870" observedRunningTime="2025-11-22 03:45:24.467909485 +0000 UTC m=+3088.773926758" watchObservedRunningTime="2025-11-22 03:45:24.479398459 +0000 UTC m=+3088.785415752" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.503013 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.511867 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dccb8556f-cmgsg"] Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.553803 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17666473-50ea-48ef-afc8-265daec6df33" path="/var/lib/kubelet/pods/17666473-50ea-48ef-afc8-265daec6df33/volumes" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.554499 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" path="/var/lib/kubelet/pods/d455bb45-949a-444f-bf5f-61736fbe9c28/volumes" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.674468 4952 scope.go:117] "RemoveContainer" 
containerID="f44b471c515b8a23345147500d06bd1bde634dadead4dcd0277c040da9f87a25" Nov 22 03:45:24 crc kubenswrapper[4952]: I1122 03:45:24.846154 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019398 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019592 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019648 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019716 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019766 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.019794 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6xdc\" (UniqueName: \"kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc\") pod \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\" (UID: \"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.050800 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc" (OuterVolumeSpecName: "kube-api-access-n6xdc") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "kube-api-access-n6xdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.081979 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config" (OuterVolumeSpecName: "config") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.083235 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.087453 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.096956 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.101044 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" (UID: "f162dc08-bc15-4ec6-b9e7-857bcdfa0dff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122239 4952 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122267 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122277 4952 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122286 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122295 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6xdc\" (UniqueName: \"kubernetes.io/projected/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-kube-api-access-n6xdc\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.122303 4952 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.158234 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327488 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327739 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qdvb\" (UniqueName: \"kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327780 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327801 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327825 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327877 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327919 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.327939 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd\") pod \"50403f1a-ba57-4c1f-86a5-67269195ca65\" (UID: \"50403f1a-ba57-4c1f-86a5-67269195ca65\") " Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.328682 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.328875 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.335248 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts" (OuterVolumeSpecName: "scripts") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.335427 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb" (OuterVolumeSpecName: "kube-api-access-6qdvb") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "kube-api-access-6qdvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.365701 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.385564 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.407893 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430218 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qdvb\" (UniqueName: \"kubernetes.io/projected/50403f1a-ba57-4c1f-86a5-67269195ca65-kube-api-access-6qdvb\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430250 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430260 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430270 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430278 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430286 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50403f1a-ba57-4c1f-86a5-67269195ca65-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.430293 4952 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.431182 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data" (OuterVolumeSpecName: "config-data") pod "50403f1a-ba57-4c1f-86a5-67269195ca65" (UID: "50403f1a-ba57-4c1f-86a5-67269195ca65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.466122 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" event={"ID":"f162dc08-bc15-4ec6-b9e7-857bcdfa0dff","Type":"ContainerDied","Data":"a04f591f581b8a75cc570586b876ea0865c4a368fbd63491cb7fdcfa4e46408d"} Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.466172 4952 scope.go:117] "RemoveContainer" containerID="845c1f6384512c20d715342d78f2350b48ffadc195ad0e2f696d985321b7a12e" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.466171 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-sxwx6" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.468957 4952 generic.go:334] "Generic (PLEG): container finished" podID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerID="fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131" exitCode=0 Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.469009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerDied","Data":"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131"} Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.469049 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50403f1a-ba57-4c1f-86a5-67269195ca65","Type":"ContainerDied","Data":"360316b03b4cdbb0ad047efda942b4766bb45aa6b6e50c5f803542cf827b86a4"} Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.468992 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.471658 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7aa384b1-1586-4709-b258-def203cac8f5","Type":"ContainerStarted","Data":"865195ff68bf012a52b973f09182eeda5a10fc48b6fba6ad606be0bec508b5fa"} Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.503367 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.503350163 podStartE2EDuration="6.503350163s" podCreationTimestamp="2025-11-22 03:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:45:25.490219405 +0000 UTC m=+3089.796236688" watchObservedRunningTime="2025-11-22 03:45:25.503350163 +0000 UTC m=+3089.809367436" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.508814 4952 scope.go:117] "RemoveContainer" containerID="f6fec01a09c157266c62d697df5140359a4d04bfac527f5200614f09f7ca37f2" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.532321 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50403f1a-ba57-4c1f-86a5-67269195ca65-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.533419 4952 scope.go:117] "RemoveContainer" containerID="c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.534764 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.549582 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-sxwx6"] Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.565186 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.578775 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.580187 4952 scope.go:117] "RemoveContainer" containerID="803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.588773 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:25 crc kubenswrapper[4952]: 
E1122 03:45:25.589316 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589332 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589348 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="proxy-httpd" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589354 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="proxy-httpd" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589364 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-notification-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589371 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-notification-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589384 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="sg-core" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589390 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="sg-core" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589409 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589414 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589426 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerName="init" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589433 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerName="init" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589443 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589452 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589468 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589473 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589487 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-central-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589492 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-central-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.589514 4952 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerName="dnsmasq-dns" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589520 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerName="dnsmasq-dns" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589818 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-notification-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589843 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="proxy-httpd" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589851 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589860 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="sg-core" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589872 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589880 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" containerName="dnsmasq-dns" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589888 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" containerName="ceilometer-central-agent" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589895 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d455bb45-949a-444f-bf5f-61736fbe9c28" containerName="horizon-log" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.589904 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="17666473-50ea-48ef-afc8-265daec6df33" containerName="horizon" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.592110 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.594024 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.594350 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.595186 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.605693 4952 scope.go:117] "RemoveContainer" containerID="fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.617926 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.631539 4952 scope.go:117] "RemoveContainer" containerID="ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.663037 4952 scope.go:117] "RemoveContainer" containerID="c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.663471 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9\": container with ID starting with c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9 not found: ID does not exist" containerID="c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.663504 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9"} err="failed to get container status \"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9\": rpc error: code = NotFound desc = could not find container \"c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9\": container with ID starting with c0b04bebd9ed531aec28fd430adc6d44e23000a227a75e35465b6c1c4c47b4a9 not found: ID does not exist" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.663526 4952 scope.go:117] "RemoveContainer" containerID="803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.664011 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4\": container with ID starting with 803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4 not found: ID does not exist" containerID="803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.664133 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4"} err="failed to get container status \"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4\": rpc error: code = NotFound desc = could not find container \"803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4\": container with ID starting with 803b394efea0353c6dcc4b5227756459a0f676ad763712060246045d990e8ad4 not found: ID does not exist" Nov 22 03:45:25 
crc kubenswrapper[4952]: I1122 03:45:25.664252 4952 scope.go:117] "RemoveContainer" containerID="fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.664790 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131\": container with ID starting with fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131 not found: ID does not exist" containerID="fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.664877 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131"} err="failed to get container status \"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131\": rpc error: code = NotFound desc = could not find container \"fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131\": container with ID starting with fadc909b8429c89e3bd89b21d26568ab55aaa449f6180242f3547974616e3131 not found: ID does not exist" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.664959 4952 scope.go:117] "RemoveContainer" containerID="ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621" Nov 22 03:45:25 crc kubenswrapper[4952]: E1122 03:45:25.665331 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621\": container with ID starting with ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621 not found: ID does not exist" containerID="ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.665465 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621"} err="failed to get container status \"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621\": rpc error: code = NotFound desc = could not find container \"ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621\": container with ID starting with ffcb9e1580e370ee0cb31f4463f1da04ab31cfd53007ae164da8e3026ec9e621 not found: ID does not exist" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.738639 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.738723 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.738863 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " 
pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.738943 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2zw\" (UniqueName: \"kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.738995 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.739024 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.739063 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.739132 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841426 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841482 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841534 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841586 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841619 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841682 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841721 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2zw\" (UniqueName: \"kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.841750 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.842197 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.842252 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.845455 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.845524 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.845888 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.845986 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.854352 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data\") 
pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.858455 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2zw\" (UniqueName: \"kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw\") pod \"ceilometer-0\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " pod="openstack/ceilometer-0" Nov 22 03:45:25 crc kubenswrapper[4952]: I1122 03:45:25.915357 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:26 crc kubenswrapper[4952]: I1122 03:45:26.378213 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:26 crc kubenswrapper[4952]: W1122 03:45:26.380232 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290ea525_330c_4eb2_ab44_e1fb7aa90384.slice/crio-4a98c9daa1a604def22634c7820e094893dfb6335fe8e764556e88573fe1116d WatchSource:0}: Error finding container 4a98c9daa1a604def22634c7820e094893dfb6335fe8e764556e88573fe1116d: Status 404 returned error can't find the container with id 4a98c9daa1a604def22634c7820e094893dfb6335fe8e764556e88573fe1116d Nov 22 03:45:26 crc kubenswrapper[4952]: I1122 03:45:26.484999 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerStarted","Data":"4a98c9daa1a604def22634c7820e094893dfb6335fe8e764556e88573fe1116d"} Nov 22 03:45:26 crc kubenswrapper[4952]: I1122 03:45:26.485047 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 22 03:45:26 crc kubenswrapper[4952]: I1122 03:45:26.543060 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50403f1a-ba57-4c1f-86a5-67269195ca65" path="/var/lib/kubelet/pods/50403f1a-ba57-4c1f-86a5-67269195ca65/volumes" Nov 22 03:45:26 crc kubenswrapper[4952]: I1122 03:45:26.544098 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f162dc08-bc15-4ec6-b9e7-857bcdfa0dff" path="/var/lib/kubelet/pods/f162dc08-bc15-4ec6-b9e7-857bcdfa0dff/volumes" Nov 22 03:45:27 crc kubenswrapper[4952]: I1122 03:45:27.527514 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerStarted","Data":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} Nov 22 03:45:28 crc kubenswrapper[4952]: I1122 03:45:28.259100 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:28 crc kubenswrapper[4952]: I1122 03:45:28.555902 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerStarted","Data":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} Nov 22 03:45:29 crc kubenswrapper[4952]: I1122 03:45:29.558066 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerStarted","Data":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} Nov 22 03:45:31 crc kubenswrapper[4952]: I1122 03:45:31.699369 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.592729 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerStarted","Data":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.592964 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="proxy-httpd" containerID="cri-o://7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" gracePeriod=30 Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.592979 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-central-agent" containerID="cri-o://0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd" gracePeriod=30 Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.593026 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="sg-core" containerID="cri-o://4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" gracePeriod=30 Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.593051 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-notification-agent" containerID="cri-o://b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" gracePeriod=30 Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.595534 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:45:32 crc kubenswrapper[4952]: I1122 03:45:32.642134 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.608699276 podStartE2EDuration="7.642114039s" podCreationTimestamp="2025-11-22 03:45:25 +0000 UTC" firstStartedPulling="2025-11-22 03:45:26.382856377 +0000 UTC m=+3090.688873650" lastFinishedPulling="2025-11-22 03:45:31.41627111 +0000 UTC m=+3095.722288413" observedRunningTime="2025-11-22 03:45:32.637969599 +0000 UTC m=+3096.943986882" watchObservedRunningTime="2025-11-22 03:45:32.642114039 +0000 UTC m=+3096.948131312" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.365724 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.531378 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:45:33 crc kubenswrapper[4952]: E1122 03:45:33.531793 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.538924 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr2zw\" (UniqueName: \"kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539063 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539140 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539211 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539266 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539305 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539496 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539572 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle\") pod \"290ea525-330c-4eb2-ab44-e1fb7aa90384\" (UID: \"290ea525-330c-4eb2-ab44-e1fb7aa90384\") " Nov 22 
03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539818 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.539894 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.540410 4952 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.540435 4952 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290ea525-330c-4eb2-ab44-e1fb7aa90384-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.544531 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts" (OuterVolumeSpecName: "scripts") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.546181 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw" (OuterVolumeSpecName: "kube-api-access-mr2zw") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "kube-api-access-mr2zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.567711 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.588037 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613635 4952 generic.go:334] "Generic (PLEG): container finished" podID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" exitCode=0 Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613690 4952 generic.go:334] "Generic (PLEG): container finished" podID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" exitCode=2 Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613699 4952 generic.go:334] "Generic (PLEG): container finished" podID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" exitCode=0 Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613706 4952 generic.go:334] "Generic (PLEG): container finished" podID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd" exitCode=0 Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613713 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613726 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerDied","Data":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613752 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerDied","Data":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613762 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerDied","Data":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613771 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerDied","Data":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613779 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290ea525-330c-4eb2-ab44-e1fb7aa90384","Type":"ContainerDied","Data":"4a98c9daa1a604def22634c7820e094893dfb6335fe8e764556e88573fe1116d"} Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.613794 4952 scope.go:117] "RemoveContainer" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.615822 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.636677 4952 scope.go:117] "RemoveContainer" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.642994 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.643040 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr2zw\" (UniqueName: \"kubernetes.io/projected/290ea525-330c-4eb2-ab44-e1fb7aa90384-kube-api-access-mr2zw\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.643059 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.643075 4952 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.643146 4952 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.652731 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data" (OuterVolumeSpecName: "config-data") pod "290ea525-330c-4eb2-ab44-e1fb7aa90384" (UID: "290ea525-330c-4eb2-ab44-e1fb7aa90384"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.656297 4952 scope.go:117] "RemoveContainer" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.676076 4952 scope.go:117] "RemoveContainer" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.698882 4952 scope.go:117] "RemoveContainer" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" Nov 22 03:45:33 crc kubenswrapper[4952]: E1122 03:45:33.699328 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": container with ID starting with 7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9 not found: ID does not exist" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.699377 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} err="failed to get container status \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": rpc error: code = NotFound desc = could not find container \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": container with ID starting with 7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9 not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.699410 4952 scope.go:117] "RemoveContainer" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" Nov 22 03:45:33 crc kubenswrapper[4952]: E1122 03:45:33.699882 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": container with ID starting with 4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305 not found: ID does not exist" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.699920 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} err="failed to get container status \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": rpc error: code = NotFound desc = could not find container \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": container with ID starting with 4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305 not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.699948 4952 scope.go:117] "RemoveContainer" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" Nov 22 03:45:33 crc kubenswrapper[4952]: E1122 03:45:33.700413 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": container with ID starting with b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311 not found: ID does not exist" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" Nov 22 03:45:33 crc 
kubenswrapper[4952]: I1122 03:45:33.700450 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} err="failed to get container status \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": rpc error: code = NotFound desc = could not find container \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": container with ID starting with b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311 not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.700472 4952 scope.go:117] "RemoveContainer" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd" Nov 22 03:45:33 crc kubenswrapper[4952]: E1122 03:45:33.700695 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": container with ID starting with 0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd not found: ID does not exist" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.700718 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} err="failed to get container status \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": rpc error: code = NotFound desc = could not find container \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": container with ID starting with 0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.700731 4952 scope.go:117] "RemoveContainer" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.700939 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} err="failed to get container status \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": rpc error: code = NotFound desc = could not find container \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": container with ID starting with 7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9 not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.700972 4952 scope.go:117] "RemoveContainer" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701186 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} err="failed to get container status \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": rpc error: code = NotFound desc = could not find container \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": container with ID starting with 4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305 not found: ID does not exist" Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701212 4952 scope.go:117] "RemoveContainer" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311" Nov 22 03:45:33 crc 
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701454 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} err="failed to get container status \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": rpc error: code = NotFound desc = could not find container \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": container with ID starting with b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701478 4952 scope.go:117] "RemoveContainer" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701801 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} err="failed to get container status \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": rpc error: code = NotFound desc = could not find container \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": container with ID starting with 0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.701832 4952 scope.go:117] "RemoveContainer" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702021 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} err="failed to get container status \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": rpc error: code = NotFound desc = could not find container \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": container with ID starting with 7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702042 4952 scope.go:117] "RemoveContainer" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702227 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} err="failed to get container status \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": rpc error: code = NotFound desc = could not find container \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": container with ID starting with 4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702251 4952 scope.go:117] "RemoveContainer" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702466 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} err="failed to get container status \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": rpc error: code = NotFound desc = could not find container \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": container with ID starting with b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702500 4952 scope.go:117] "RemoveContainer" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702788 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} err="failed to get container status \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": rpc error: code = NotFound desc = could not find container \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": container with ID starting with 0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.702810 4952 scope.go:117] "RemoveContainer" containerID="7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703080 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9"} err="failed to get container status \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": rpc error: code = NotFound desc = could not find container \"7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9\": container with ID starting with 7bd36d13972f26d5d8ab143ff82a5c95912eff84134e1eb862f86947f28103a9 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703105 4952 scope.go:117] "RemoveContainer" containerID="4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703315 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305"} err="failed to get container status \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": rpc error: code = NotFound desc = could not find container \"4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305\": container with ID starting with 4087618d14d2f9736d640bec32a3fbccb3e1f082617108e7a93dd80ed78ff305 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703342 4952 scope.go:117] "RemoveContainer" containerID="b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703592 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311"} err="failed to get container status \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": rpc error: code = NotFound desc = could not find container \"b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311\": container with ID starting with b7125b04c6736000138cf2735d1428285237751bd52ce76fcf6d76ec97013311 not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703614 4952 scope.go:117] "RemoveContainer" containerID="0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.703853 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd"} err="failed to get container status \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": rpc error: code = NotFound desc = could not find container \"0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd\": container with ID starting with 0aa76184aa011fa8466fc49dd517cb13d932cbdc105b743127799438670a25fd not found: ID does not exist"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.744978 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290ea525-330c-4eb2-ab44-e1fb7aa90384-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.958159 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.980871 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.982381 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:45:33 crc kubenswrapper[4952]: I1122 03:45:33.999763 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:45:34 crc kubenswrapper[4952]: E1122 03:45:34.000346 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="sg-core"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000368 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="sg-core"
Nov 22 03:45:34 crc kubenswrapper[4952]: E1122 03:45:34.000431 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="proxy-httpd"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000445 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="proxy-httpd"
Nov 22 03:45:34 crc kubenswrapper[4952]: E1122 03:45:34.000467 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-central-agent"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000478 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-central-agent"
Nov 22 03:45:34 crc kubenswrapper[4952]: E1122 03:45:34.000494 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-notification-agent"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000502 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-notification-agent"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000775 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="ceilometer-notification-agent"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000795 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="proxy-httpd"
containerName="ceilometer-central-agent" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.000831 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" containerName="sg-core" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.003154 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.007404 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.012414 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.012606 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.017840 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.051906 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.051999 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-config-data\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052028 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-run-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052056 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmk9h\" (UniqueName: \"kubernetes.io/projected/ba66b462-c52b-4474-80c9-670bf6be8870-kube-api-access-hmk9h\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052080 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-scripts\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052166 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052228 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-log-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.052330 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.154916 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155065 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155143 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-config-data\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155175 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-run-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155213 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmk9h\" (UniqueName: \"kubernetes.io/projected/ba66b462-c52b-4474-80c9-670bf6be8870-kube-api-access-hmk9h\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155245 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-scripts\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155336 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.155374 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-log-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.156017 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-log-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.156360 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba66b462-c52b-4474-80c9-670bf6be8870-run-httpd\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.162842 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.163433 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-scripts\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.168121 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.171663 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.177909 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba66b462-c52b-4474-80c9-670bf6be8870-config-data\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.188475 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmk9h\" (UniqueName: \"kubernetes.io/projected/ba66b462-c52b-4474-80c9-670bf6be8870-kube-api-access-hmk9h\") pod \"ceilometer-0\" (UID: \"ba66b462-c52b-4474-80c9-670bf6be8870\") " pod="openstack/ceilometer-0" Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.339597 4952 util.go:30] "No sandbox for pod can be found. 
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.339597 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.547880 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290ea525-330c-4eb2-ab44-e1fb7aa90384" path="/var/lib/kubelet/pods/290ea525-330c-4eb2-ab44-e1fb7aa90384/volumes"
Nov 22 03:45:34 crc kubenswrapper[4952]: W1122 03:45:34.844630 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba66b462_c52b_4474_80c9_670bf6be8870.slice/crio-f5f374b7f179796f86cf916e23c57f81ced90dbc417bc4f2a77dca6ed8c33643 WatchSource:0}: Error finding container f5f374b7f179796f86cf916e23c57f81ced90dbc417bc4f2a77dca6ed8c33643: Status 404 returned error can't find the container with id f5f374b7f179796f86cf916e23c57f81ced90dbc417bc4f2a77dca6ed8c33643
Nov 22 03:45:34 crc kubenswrapper[4952]: I1122 03:45:34.846645 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.389908 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.440469 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.446945 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.488189 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665098 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"c21565680c74156ecf3f9c8f03c688603e116314d6601ed9cd5d62a4ecda977f"}
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665166 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"f5f374b7f179796f86cf916e23c57f81ced90dbc417bc4f2a77dca6ed8c33643"}
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665272 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="manila-scheduler" containerID="cri-o://a2204f983935a8056dc5ff1c9ba5b7f67a88561af2d2b80fd7854d064b1cdcb0" gracePeriod=30
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665406 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="probe" containerID="cri-o://71eaa9a82a8ad905bc424adfc104d52070a492c3f97b99f196f061834c2de3bf" gracePeriod=30
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665450 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="manila-share" containerID="cri-o://2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0" gracePeriod=30
Nov 22 03:45:35 crc kubenswrapper[4952]: I1122 03:45:35.665572 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="probe" containerID="cri-o://78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55" gracePeriod=30
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.473136 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606045 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606206 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wj6v\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606297 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606374 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606406 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606458 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606509 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.606576 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph\") pod \"e5f2b58e-2b8d-4efa-b37c-77717d271276\" (UID: \"e5f2b58e-2b8d-4efa-b37c-77717d271276\") "
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.609261 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.610938 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.614777 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts" (OuterVolumeSpecName: "scripts") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.615053 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v" (OuterVolumeSpecName: "kube-api-access-5wj6v") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "kube-api-access-5wj6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.619910 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph" (OuterVolumeSpecName: "ceph") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.620780 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.674300 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"cb09209ad57b119c2e6e0e4881d14b58adcadd8df0131d72668a8bd0387887b1"}
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.676973 4952 generic.go:334] "Generic (PLEG): container finished" podID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerID="78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55" exitCode=0
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677119 4952 generic.go:334] "Generic (PLEG): container finished" podID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerID="2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0" exitCode=1
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677207 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerDied","Data":"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"}
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677285 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerDied","Data":"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"}
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677343 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e5f2b58e-2b8d-4efa-b37c-77717d271276","Type":"ContainerDied","Data":"87a7cbe3cb60ea215332f4a30105294dc71d3955c86ad80db23150f7a57225ff"}
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677406 4952 scope.go:117] "RemoveContainer" containerID="78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677396 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.677530 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.679815 4952 generic.go:334] "Generic (PLEG): container finished" podID="511d091c-af24-4531-b382-304c3ee5ecff" containerID="71eaa9a82a8ad905bc424adfc104d52070a492c3f97b99f196f061834c2de3bf" exitCode=0
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.679921 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerDied","Data":"71eaa9a82a8ad905bc424adfc104d52070a492c3f97b99f196f061834c2de3bf"}
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.700017 4952 scope.go:117] "RemoveContainer" containerID="2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709287 4952 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-var-lib-manila\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709448 4952 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709508 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709586 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wj6v\" (UniqueName: \"kubernetes.io/projected/e5f2b58e-2b8d-4efa-b37c-77717d271276-kube-api-access-5wj6v\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709670 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5f2b58e-2b8d-4efa-b37c-77717d271276-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709726 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.709788 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.720039 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data" (OuterVolumeSpecName: "config-data") pod "e5f2b58e-2b8d-4efa-b37c-77717d271276" (UID: "e5f2b58e-2b8d-4efa-b37c-77717d271276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.750391 4952 scope.go:117] "RemoveContainer" containerID="78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"
Nov 22 03:45:36 crc kubenswrapper[4952]: E1122 03:45:36.750948 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55\": container with ID starting with 78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55 not found: ID does not exist" containerID="78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.750990 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"} err="failed to get container status \"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55\": rpc error: code = NotFound desc = could not find container \"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55\": container with ID starting with 78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55 not found: ID does not exist"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751014 4952 scope.go:117] "RemoveContainer" containerID="2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"
Nov 22 03:45:36 crc kubenswrapper[4952]: E1122 03:45:36.751344 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0\": container with ID starting with 2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0 not found: ID does not exist" containerID="2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751376 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"} err="failed to get container status \"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0\": rpc error: code = NotFound desc = could not find container \"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0\": container with ID starting with 2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0 not found: ID does not exist"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751395 4952 scope.go:117] "RemoveContainer" containerID="78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751683 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55"} err="failed to get container status \"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55\": rpc error: code = NotFound desc = could not find container \"78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55\": container with ID starting with 78a1a535205af5d0c1f9537b10a254ed24438fe170e8609e8b69b22f13afeb55 not found: ID does not exist"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751709 4952 scope.go:117] "RemoveContainer" containerID="2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.751964 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0"} err="failed to get container status \"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0\": rpc error: code = NotFound desc = could not find container \"2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0\": container with ID starting with 2e880f2f2ecafd54f5dab665e208e19bbcc8cabeda3a76db8d82c5f735d132b0 not found: ID does not exist"
Nov 22 03:45:36 crc kubenswrapper[4952]: I1122 03:45:36.812219 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f2b58e-2b8d-4efa-b37c-77717d271276-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.157722 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.185571 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.214308 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:37 crc kubenswrapper[4952]: E1122 03:45:37.215053 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="manila-share"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.215158 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="manila-share"
Nov 22 03:45:37 crc kubenswrapper[4952]: E1122 03:45:37.215287 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="probe"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.215365 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="probe"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.215736 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="probe"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.215869 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" containerName="manila-share"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.217239 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.221882 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.222944 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322523 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk246\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-kube-api-access-kk246\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322613 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322637 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322684 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-ceph\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322706 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-scripts\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322787 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322818 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.322847 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424386 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk246\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-kube-api-access-kk246\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424469 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424497 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424560 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-ceph\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424587 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-scripts\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424633 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424682 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424873 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.424943 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.425063 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/02064695-893b-4036-83b5-17863ffb7028-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.429454 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-ceph\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.429615 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.430027 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-config-data\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.430169 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-scripts\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.438619 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02064695-893b-4036-83b5-17863ffb7028-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.445710 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk246\" (UniqueName: \"kubernetes.io/projected/02064695-893b-4036-83b5-17863ffb7028-kube-api-access-kk246\") pod \"manila-share-share1-0\" (UID: \"02064695-893b-4036-83b5-17863ffb7028\") " pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.538721 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Nov 22 03:45:37 crc kubenswrapper[4952]: I1122 03:45:37.718999 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"d4c5338069679af005f19d0d1bf4c3a666f51a30fb6fbae5de2a726cee68b3ef"}
Nov 22 03:45:38 crc kubenswrapper[4952]: I1122 03:45:38.105480 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:45:38 crc kubenswrapper[4952]: W1122 03:45:38.117721 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02064695_893b_4036_83b5_17863ffb7028.slice/crio-30662fe39e719563e04a29e0b278a0100542593130442bc336ca92b79c754f5b WatchSource:0}: Error finding container 30662fe39e719563e04a29e0b278a0100542593130442bc336ca92b79c754f5b: Status 404 returned error can't find the container with id 30662fe39e719563e04a29e0b278a0100542593130442bc336ca92b79c754f5b
Nov 22 03:45:38 crc kubenswrapper[4952]: I1122 03:45:38.555372 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f2b58e-2b8d-4efa-b37c-77717d271276" path="/var/lib/kubelet/pods/e5f2b58e-2b8d-4efa-b37c-77717d271276/volumes"
Nov 22 03:45:38 crc kubenswrapper[4952]: I1122 03:45:38.728923 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"02064695-893b-4036-83b5-17863ffb7028","Type":"ContainerStarted","Data":"30662fe39e719563e04a29e0b278a0100542593130442bc336ca92b79c754f5b"}
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.747921 4952 generic.go:334] "Generic (PLEG): container finished" podID="511d091c-af24-4531-b382-304c3ee5ecff" containerID="a2204f983935a8056dc5ff1c9ba5b7f67a88561af2d2b80fd7854d064b1cdcb0" exitCode=0
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.748361 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerDied","Data":"a2204f983935a8056dc5ff1c9ba5b7f67a88561af2d2b80fd7854d064b1cdcb0"}
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.750793 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"02064695-893b-4036-83b5-17863ffb7028","Type":"ContainerStarted","Data":"a12857173fcc565c27b81fefa5a01c22c7c01e20b066bace56f57780266243ff"}
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.753976 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"09ab67f07b5b35caf0d105fdd38a1309205c9e213ffbba4b1a79daded35ce71a"}
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.754471 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.782341 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.516531826 podStartE2EDuration="6.782321202s" podCreationTimestamp="2025-11-22 03:45:33 +0000 UTC" firstStartedPulling="2025-11-22 03:45:34.848682755 +0000 UTC m=+3099.154700028" lastFinishedPulling="2025-11-22 03:45:39.114472131 +0000 UTC m=+3103.420489404" observedRunningTime="2025-11-22 03:45:39.77393338 +0000 UTC m=+3104.079950653" watchObservedRunningTime="2025-11-22 03:45:39.782321202 +0000 UTC m=+3104.088338475"
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.848158 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.988971 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989126 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989160 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989219 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mr7z\" (UniqueName: \"kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989329 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989372 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id\") pod \"511d091c-af24-4531-b382-304c3ee5ecff\" (UID: \"511d091c-af24-4531-b382-304c3ee5ecff\") "
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.989564 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.995330 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.995596 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts" (OuterVolumeSpecName: "scripts") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:39 crc kubenswrapper[4952]: I1122 03:45:39.996751 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z" (OuterVolumeSpecName: "kube-api-access-9mr7z") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "kube-api-access-9mr7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.053653 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.091720 4952 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/511d091c-af24-4531-b382-304c3ee5ecff-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.091764 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.091777 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.091789 4952 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.091802 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mr7z\" (UniqueName: \"kubernetes.io/projected/511d091c-af24-4531-b382-304c3ee5ecff-kube-api-access-9mr7z\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.104894 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data" (OuterVolumeSpecName: "config-data") pod "511d091c-af24-4531-b382-304c3ee5ecff" (UID: "511d091c-af24-4531-b382-304c3ee5ecff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.193561 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/511d091c-af24-4531-b382-304c3ee5ecff-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.766672 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.766682 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"511d091c-af24-4531-b382-304c3ee5ecff","Type":"ContainerDied","Data":"b647dea27fc1e995c37a1425c6136abeb1962947a8ceed4303df2c8be636149b"}
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.767145 4952 scope.go:117] "RemoveContainer" containerID="71eaa9a82a8ad905bc424adfc104d52070a492c3f97b99f196f061834c2de3bf"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.770410 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"02064695-893b-4036-83b5-17863ffb7028","Type":"ContainerStarted","Data":"16723012c0b6eceec8ad25e2e727819e7a980554f9b8a2b785016ec751a2b792"}
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.811509 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.811595 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.846336 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:45:40 crc kubenswrapper[4952]: E1122 03:45:40.846796 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="manila-scheduler"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.846813 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="manila-scheduler"
Nov 22 03:45:40 crc kubenswrapper[4952]: E1122 03:45:40.846837 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="probe"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.846845 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="probe"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.847075 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="manila-scheduler"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.847100 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="511d091c-af24-4531-b382-304c3ee5ecff" containerName="probe"
Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.848334 4952 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.850470 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.852728 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.852713118 podStartE2EDuration="3.852713118s" podCreationTimestamp="2025-11-22 03:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:45:40.818199672 +0000 UTC m=+3105.124216945" watchObservedRunningTime="2025-11-22 03:45:40.852713118 +0000 UTC m=+3105.158730391" Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.865468 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:45:40 crc kubenswrapper[4952]: I1122 03:45:40.880794 4952 scope.go:117] "RemoveContainer" containerID="a2204f983935a8056dc5ff1c9ba5b7f67a88561af2d2b80fd7854d064b1cdcb0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.019732 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.019804 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21fe7ebd-1970-410b-948e-1837b5d6295b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.019917 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-scripts\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.019949 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.019975 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tgz5\" (UniqueName: \"kubernetes.io/projected/21fe7ebd-1970-410b-948e-1837b5d6295b-kube-api-access-4tgz5\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.020007 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.055168 4952 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122319 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122403 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21fe7ebd-1970-410b-948e-1837b5d6295b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122513 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-scripts\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122559 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122584 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tgz5\" (UniqueName: \"kubernetes.io/projected/21fe7ebd-1970-410b-948e-1837b5d6295b-kube-api-access-4tgz5\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122607 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.122600 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21fe7ebd-1970-410b-948e-1837b5d6295b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.129327 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.129529 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-config-data\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.129759 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-scripts\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.149143 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fe7ebd-1970-410b-948e-1837b5d6295b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.149642 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tgz5\" (UniqueName: \"kubernetes.io/projected/21fe7ebd-1970-410b-948e-1837b5d6295b-kube-api-access-4tgz5\") pod \"manila-scheduler-0\" (UID: \"21fe7ebd-1970-410b-948e-1837b5d6295b\") " pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.199513 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.698716 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5675778f5b-wg7px" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused" Nov 22 03:45:41 crc kubenswrapper[4952]: I1122 03:45:41.781018 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:45:41 crc kubenswrapper[4952]: W1122 03:45:41.784409 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21fe7ebd_1970_410b_948e_1837b5d6295b.slice/crio-027830cd6e09774d5d75c1c1dad3e04969828d2f22b17fbb2b7a815e9105d9c9 WatchSource:0}: Error finding container 027830cd6e09774d5d75c1c1dad3e04969828d2f22b17fbb2b7a815e9105d9c9: Status 404 returned error can't find the container with id 027830cd6e09774d5d75c1c1dad3e04969828d2f22b17fbb2b7a815e9105d9c9 Nov 22 03:45:42 crc kubenswrapper[4952]: I1122 03:45:42.547508 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511d091c-af24-4531-b382-304c3ee5ecff" path="/var/lib/kubelet/pods/511d091c-af24-4531-b382-304c3ee5ecff/volumes" Nov 22 03:45:42 crc kubenswrapper[4952]: I1122 03:45:42.804636 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21fe7ebd-1970-410b-948e-1837b5d6295b","Type":"ContainerStarted","Data":"b59c022beec0956ab061879390eff09d84ea0c8a35568a1216609e0be8cd296e"} Nov 22 03:45:42 crc kubenswrapper[4952]: I1122 03:45:42.804689 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21fe7ebd-1970-410b-948e-1837b5d6295b","Type":"ContainerStarted","Data":"f40e0ffdef4f5e695feeab4d5e6cf399cede3a64e2b33ff1148c50ec9c641c20"} Nov 22 03:45:42 crc kubenswrapper[4952]: I1122 03:45:42.804701 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21fe7ebd-1970-410b-948e-1837b5d6295b","Type":"ContainerStarted","Data":"027830cd6e09774d5d75c1c1dad3e04969828d2f22b17fbb2b7a815e9105d9c9"} Nov 22 03:45:42 crc kubenswrapper[4952]: I1122 03:45:42.838676 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.838658784 
podStartE2EDuration="2.838658784s" podCreationTimestamp="2025-11-22 03:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:45:42.8313589 +0000 UTC m=+3107.137376203" watchObservedRunningTime="2025-11-22 03:45:42.838658784 +0000 UTC m=+3107.144676057" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.587927 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769537 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769632 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769681 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769733 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769792 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgq8l\" (UniqueName: \"kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769888 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.769955 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs\") pod \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\" (UID: \"d8eb2643-4d7c-4814-91fe-1192d3fc753d\") " Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.770520 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs" (OuterVolumeSpecName: "logs") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.778021 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l" (OuterVolumeSpecName: "kube-api-access-zgq8l") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "kube-api-access-zgq8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.778402 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.801027 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.804112 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data" (OuterVolumeSpecName: "config-data") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.812196 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts" (OuterVolumeSpecName: "scripts") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.829300 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d8eb2643-4d7c-4814-91fe-1192d3fc753d" (UID: "d8eb2643-4d7c-4814-91fe-1192d3fc753d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.847391 4952 generic.go:334] "Generic (PLEG): container finished" podID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerID="fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257" exitCode=137 Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.847441 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerDied","Data":"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257"} Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.847473 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5675778f5b-wg7px" event={"ID":"d8eb2643-4d7c-4814-91fe-1192d3fc753d","Type":"ContainerDied","Data":"7e6136528ed73b43e4e674675c3fa3ed5abea6dbcad3ec5608d87440097114d6"} Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.847495 4952 scope.go:117] "RemoveContainer" containerID="3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.847494 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5675778f5b-wg7px" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.872808 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.872977 4952 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8eb2643-4d7c-4814-91fe-1192d3fc753d-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.873060 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.873145 4952 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8eb2643-4d7c-4814-91fe-1192d3fc753d-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.873249 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgq8l\" (UniqueName: \"kubernetes.io/projected/d8eb2643-4d7c-4814-91fe-1192d3fc753d-kube-api-access-zgq8l\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.873365 4952 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.874124 4952 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8eb2643-4d7c-4814-91fe-1192d3fc753d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.908492 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:45:46 crc kubenswrapper[4952]: I1122 03:45:46.916233 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5675778f5b-wg7px"] Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.081848 4952 
scope.go:117] "RemoveContainer" containerID="fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257" Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.117835 4952 scope.go:117] "RemoveContainer" containerID="3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f" Nov 22 03:45:47 crc kubenswrapper[4952]: E1122 03:45:47.118581 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f\": container with ID starting with 3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f not found: ID does not exist" containerID="3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f" Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.118643 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f"} err="failed to get container status \"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f\": rpc error: code = NotFound desc = could not find container \"3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f\": container with ID starting with 3048b2f97eb28bedd926106b8d79a7305914bbde006d9e89768a2659d575f71f not found: ID does not exist" Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.118678 4952 scope.go:117] "RemoveContainer" containerID="fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257" Nov 22 03:45:47 crc kubenswrapper[4952]: E1122 03:45:47.118983 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257\": container with ID starting with fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257 not found: ID does not exist" containerID="fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257" Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.119028 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257"} err="failed to get container status \"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257\": rpc error: code = NotFound desc = could not find container \"fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257\": container with ID starting with fcb3f4c158aff30431cd1c8359c3e7a82cf2d483250cad37132024d0df9ad257 not found: ID does not exist" Nov 22 03:45:47 crc kubenswrapper[4952]: I1122 03:45:47.539629 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 22 03:45:48 crc kubenswrapper[4952]: I1122 03:45:48.531516 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:45:48 crc kubenswrapper[4952]: E1122 03:45:48.532357 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:45:48 crc kubenswrapper[4952]: I1122 03:45:48.553222 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" path="/var/lib/kubelet/pods/d8eb2643-4d7c-4814-91fe-1192d3fc753d/volumes" Nov 22 03:45:51 crc kubenswrapper[4952]: I1122 03:45:51.200786 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 22 03:45:59 crc kubenswrapper[4952]: I1122 03:45:59.027568 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 22 03:46:01 crc kubenswrapper[4952]: I1122 03:46:01.531485 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:46:01 crc kubenswrapper[4952]: E1122 03:46:01.532071 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:46:02 crc kubenswrapper[4952]: I1122 03:46:02.685787 4952 scope.go:117] "RemoveContainer" containerID="7546d98f14d72669632d1a7dede418be0031153b6253bbb381f71b05b1386da5" Nov 22 03:46:02 crc kubenswrapper[4952]: I1122 03:46:02.781265 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 22 03:46:04 crc kubenswrapper[4952]: I1122 03:46:04.347982 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:46:16 crc kubenswrapper[4952]: I1122 03:46:16.544150 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:46:16 crc kubenswrapper[4952]: E1122 03:46:16.545686 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:46:31 crc kubenswrapper[4952]: I1122 03:46:31.531796 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:46:31 crc kubenswrapper[4952]: E1122 03:46:31.532511 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:46:46 crc kubenswrapper[4952]: I1122 03:46:46.544310 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:46:46 crc kubenswrapper[4952]: E1122 03:46:46.545525 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.842953 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:46:51 crc kubenswrapper[4952]: E1122 03:46:51.844278 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.844295 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" Nov 22 03:46:51 crc kubenswrapper[4952]: E1122 03:46:51.844317 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon-log" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.844323 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon-log" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.844613 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon-log" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.844628 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8eb2643-4d7c-4814-91fe-1192d3fc753d" containerName="horizon" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.845566 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.849197 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z7k5c" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.849423 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.849581 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.849904 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.861041 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975028 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975095 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjpx4\" (UniqueName: \"kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975135 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975336 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975700 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975750 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975787 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.975819 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:51 crc kubenswrapper[4952]: I1122 03:46:51.976001 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.078647 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.078778 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.078825 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjpx4\" (UniqueName: 
\"kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.078863 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.078915 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.079026 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.079051 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.079075 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.079101 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.079882 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.080322 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.080395 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.081154 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.081879 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.090862 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.091139 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.093017 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.109049 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjpx4\" (UniqueName: \"kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.125060 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.180621 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 03:46:52 crc kubenswrapper[4952]: I1122 03:46:52.620569 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:46:53 crc kubenswrapper[4952]: I1122 03:46:53.602425 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b32e0459-7aee-4841-8281-da334fe3e8d8","Type":"ContainerStarted","Data":"57c42742ad191227ab4ae8d29b85a2a7068f2524ff28dcf2a47fe6e7988a6809"} Nov 22 03:47:00 crc kubenswrapper[4952]: I1122 03:47:00.531796 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d" Nov 22 03:47:01 crc kubenswrapper[4952]: I1122 03:47:01.675279 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9"} Nov 22 03:47:24 crc kubenswrapper[4952]: E1122 03:47:24.690373 4952 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 22 03:47:24 crc kubenswrapper[4952]: E1122 03:47:24.691744 4952 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kjpx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Se
curityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b32e0459-7aee-4841-8281-da334fe3e8d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 22 03:47:24 crc kubenswrapper[4952]: E1122 03:47:24.693017 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b32e0459-7aee-4841-8281-da334fe3e8d8"
Nov 22 03:47:24 crc kubenswrapper[4952]: E1122 03:47:24.920374 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b32e0459-7aee-4841-8281-da334fe3e8d8"
Nov 22 03:47:39 crc kubenswrapper[4952]: I1122 03:47:39.020823 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Nov 22 03:47:40 crc kubenswrapper[4952]: I1122 03:47:40.079830 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b32e0459-7aee-4841-8281-da334fe3e8d8","Type":"ContainerStarted","Data":"88b54f14f1febd929130bb4eeb738873694fb90cdd27e9698f0b9b0fe899a3bd"}
Nov 22 03:47:40 crc kubenswrapper[4952]: I1122 03:47:40.119606 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.731619588 podStartE2EDuration="50.119580886s" podCreationTimestamp="2025-11-22 03:46:50 +0000 UTC" firstStartedPulling="2025-11-22 03:46:52.630297831 +0000 UTC m=+3176.936315104" lastFinishedPulling="2025-11-22 03:47:39.018259129 +0000 UTC m=+3223.324276402" observedRunningTime="2025-11-22 03:47:40.104313921 +0000 UTC m=+3224.410331204" watchObservedRunningTime="2025-11-22 03:47:40.119580886 +0000 UTC m=+3224.425598179"
Nov 22 03:49:28 crc kubenswrapper[4952]: I1122 03:49:28.342584 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:49:28 crc kubenswrapper[4952]: I1122 03:49:28.343223 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:49:58 crc kubenswrapper[4952]: I1122 03:49:58.341817 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:49:58 crc kubenswrapper[4952]: I1122 03:49:58.342371 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.342193 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.342815 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.342876 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.343797 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.343929 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9" gracePeriod=600
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.791160 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9" exitCode=0
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.791256 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9"}
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.791506 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"}
Nov 22 03:50:28 crc kubenswrapper[4952]: I1122 03:50:28.791531 4952 scope.go:117] "RemoveContainer" containerID="f80f08fd01e9f3fbf3e5e6b1c9b51f296befa76b0ae2af2f9348b278c832a41d"
Nov 22 03:51:03 crc kubenswrapper[4952]: I1122 03:51:03.014264 4952 scope.go:117] "RemoveContainer" containerID="c58e7f44b616beac7769a520d0b87ada38ab1bee9c2be1faa5493194fcc8545d"
Nov 22 03:51:03 crc kubenswrapper[4952]: I1122 03:51:03.046991 4952 scope.go:117] "RemoveContainer" containerID="2a97d988198d26db45920fbb7e85c2d922603ee037ee761c94b76915d3d065db"
Nov 22 03:51:03 crc kubenswrapper[4952]: I1122 03:51:03.101255 4952 scope.go:117] "RemoveContainer" containerID="3fd8ab46c5f1b3fc8d929b41d8370a61d1f7230bacb6773799f8a57573e328ff"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.231091 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.236002 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.245289 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm4dr\" (UniqueName: \"kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.245402 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.245470 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.277827 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.347379 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm4dr\" (UniqueName: \"kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.347438 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.347473 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.348024 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.348513 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.367267 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm4dr\" (UniqueName: \"kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr\") pod \"redhat-operators-fvd72\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") " pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:04 crc kubenswrapper[4952]: I1122 03:52:04.569386 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:05 crc kubenswrapper[4952]: I1122 03:52:05.169740 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:05 crc kubenswrapper[4952]: I1122 03:52:05.249936 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerStarted","Data":"7e134e4c177079a0f0bc33876af81a993118d265361b137b76a705f12ed8f7b2"}
Nov 22 03:52:06 crc kubenswrapper[4952]: I1122 03:52:06.264072 4952 generic.go:334] "Generic (PLEG): container finished" podID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerID="adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10" exitCode=0
Nov 22 03:52:06 crc kubenswrapper[4952]: I1122 03:52:06.264184 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerDied","Data":"adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10"}
Nov 22 03:52:06 crc kubenswrapper[4952]: I1122 03:52:06.267475 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:52:08 crc kubenswrapper[4952]: I1122 03:52:08.285604 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerStarted","Data":"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"}
Nov 22 03:52:09 crc kubenswrapper[4952]: I1122 03:52:09.294571 4952 generic.go:334] "Generic (PLEG): container finished" podID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerID="cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89" exitCode=0
Nov 22 03:52:09 crc kubenswrapper[4952]: I1122 03:52:09.294648 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerDied","Data":"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"}
Nov 22 03:52:11 crc kubenswrapper[4952]: I1122 03:52:11.315083 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerStarted","Data":"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"}
Nov 22 03:52:11 crc kubenswrapper[4952]: I1122 03:52:11.337040 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fvd72" podStartSLOduration=3.656085459 podStartE2EDuration="7.337021971s" podCreationTimestamp="2025-11-22 03:52:04 +0000 UTC" firstStartedPulling="2025-11-22 03:52:06.267073544 +0000 UTC m=+3490.573090847" lastFinishedPulling="2025-11-22 03:52:09.948010046 +0000 UTC m=+3494.254027359" observedRunningTime="2025-11-22 03:52:11.333202279 +0000 UTC m=+3495.639219622" watchObservedRunningTime="2025-11-22 03:52:11.337021971 +0000 UTC m=+3495.643039244"
Nov 22 03:52:14 crc kubenswrapper[4952]: I1122 03:52:14.570363 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:14 crc kubenswrapper[4952]: I1122 03:52:14.571886 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:15 crc kubenswrapper[4952]: I1122 03:52:15.615536 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fvd72" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:52:15 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:52:15 crc kubenswrapper[4952]: >
Nov 22 03:52:25 crc kubenswrapper[4952]: I1122 03:52:25.640039 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fvd72" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:52:25 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:52:25 crc kubenswrapper[4952]: >
Nov 22 03:52:28 crc kubenswrapper[4952]: I1122 03:52:28.341647 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:52:28 crc kubenswrapper[4952]: I1122 03:52:28.342016 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:52:34 crc kubenswrapper[4952]: I1122 03:52:34.631800 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:34 crc kubenswrapper[4952]: I1122 03:52:34.713127 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:35 crc kubenswrapper[4952]: I1122 03:52:35.432400 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:36 crc kubenswrapper[4952]: I1122 03:52:36.629644 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fvd72" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server" containerID="cri-o://6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788" gracePeriod=2
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.324893 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.467047 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities\") pod \"0cc65964-0df2-4042-ac76-cb4b7b684e23\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") "
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.467101 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm4dr\" (UniqueName: \"kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr\") pod \"0cc65964-0df2-4042-ac76-cb4b7b684e23\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") "
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.467353 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content\") pod \"0cc65964-0df2-4042-ac76-cb4b7b684e23\" (UID: \"0cc65964-0df2-4042-ac76-cb4b7b684e23\") "
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.468060 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities" (OuterVolumeSpecName: "utilities") pod "0cc65964-0df2-4042-ac76-cb4b7b684e23" (UID: "0cc65964-0df2-4042-ac76-cb4b7b684e23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.489905 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr" (OuterVolumeSpecName: "kube-api-access-fm4dr") pod "0cc65964-0df2-4042-ac76-cb4b7b684e23" (UID: "0cc65964-0df2-4042-ac76-cb4b7b684e23"). InnerVolumeSpecName "kube-api-access-fm4dr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.563796 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cc65964-0df2-4042-ac76-cb4b7b684e23" (UID: "0cc65964-0df2-4042-ac76-cb4b7b684e23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.571709 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.571795 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm4dr\" (UniqueName: \"kubernetes.io/projected/0cc65964-0df2-4042-ac76-cb4b7b684e23-kube-api-access-fm4dr\") on node \"crc\" DevicePath \"\""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.571813 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc65964-0df2-4042-ac76-cb4b7b684e23-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.644031 4952 generic.go:334] "Generic (PLEG): container finished" podID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerID="6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788" exitCode=0
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.644067 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerDied","Data":"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"}
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.644094 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fvd72" event={"ID":"0cc65964-0df2-4042-ac76-cb4b7b684e23","Type":"ContainerDied","Data":"7e134e4c177079a0f0bc33876af81a993118d265361b137b76a705f12ed8f7b2"}
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.644117 4952 scope.go:117] "RemoveContainer" containerID="6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.644308 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fvd72"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.685754 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.685908 4952 scope.go:117] "RemoveContainer" containerID="cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.690504 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fvd72"]
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.712467 4952 scope.go:117] "RemoveContainer" containerID="adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.755716 4952 scope.go:117] "RemoveContainer" containerID="6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"
Nov 22 03:52:38 crc kubenswrapper[4952]: E1122 03:52:37.756331 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788\": container with ID starting with 6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788 not found: ID does not exist" containerID="6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.756446 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788"} err="failed to get container status \"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788\": rpc error: code = NotFound desc = could not find container \"6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788\": container with ID starting with 6de5e0abb19694d4d90d44dc906a514fa45decddd9081dc437aec0592acec788 not found: ID does not exist"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.756506 4952 scope.go:117] "RemoveContainer" containerID="cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"
Nov 22 03:52:38 crc kubenswrapper[4952]: E1122 03:52:37.760188 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89\": container with ID starting with cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89 not found: ID does not exist" containerID="cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.760256 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89"} err="failed to get container status \"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89\": rpc error: code = NotFound desc = could not find container \"cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89\": container with ID starting with cd3643eea7fe2fe86450bc599f3ac22d3967b9b1da055d147a7bbdb086856d89 not found: ID does not exist"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.760291 4952 scope.go:117] "RemoveContainer" containerID="adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10"
Nov 22 03:52:38 crc kubenswrapper[4952]: E1122 03:52:37.760794 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10\": container with ID starting with adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10 not found: ID does not exist" containerID="adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:37.760834 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10"} err="failed to get container status \"adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10\": rpc error: code = NotFound desc = could not find container \"adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10\": container with ID starting with adeb1982f8d56f42fe806f5ec833350d96086c8cad906e0d46370c4863ed4f10 not found: ID does not exist"
Nov 22 03:52:38 crc kubenswrapper[4952]: I1122 03:52:38.545270 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" path="/var/lib/kubelet/pods/0cc65964-0df2-4042-ac76-cb4b7b684e23/volumes"
Nov 22 03:52:58 crc kubenswrapper[4952]: I1122 03:52:58.342470 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:52:58 crc kubenswrapper[4952]: I1122 03:52:58.344786 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:53:28 crc kubenswrapper[4952]: I1122 03:53:28.342748 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:53:28 crc kubenswrapper[4952]: I1122 03:53:28.343612 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:53:28 crc kubenswrapper[4952]: I1122 03:53:28.343701 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"
Nov 22 03:53:28 crc kubenswrapper[4952]: I1122 03:53:28.345076 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:53:28 crc kubenswrapper[4952]: I1122 03:53:28.345184 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038" gracePeriod=600
Nov 22 03:53:28 crc kubenswrapper[4952]: E1122 03:53:28.474808 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:53:29 crc kubenswrapper[4952]: I1122 03:53:29.232058 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038" exitCode=0
Nov 22 03:53:29 crc kubenswrapper[4952]: I1122 03:53:29.232115 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"}
Nov 22 03:53:29 crc kubenswrapper[4952]: I1122 03:53:29.232165 4952 scope.go:117] "RemoveContainer" containerID="18f1ff23a633e6574051c2005b02b47415c04fcaae2a51a398e5ebafb57834d9"
Nov 22 03:53:29 crc kubenswrapper[4952]: I1122 03:53:29.232774 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:53:29 crc kubenswrapper[4952]: E1122 03:53:29.233012 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:53:41 crc kubenswrapper[4952]: I1122 03:53:41.531954 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:53:41 crc kubenswrapper[4952]: E1122 03:53:41.533126 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:53:53 crc kubenswrapper[4952]: I1122 03:53:53.531636 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:53:53 crc kubenswrapper[4952]: E1122 03:53:53.532458 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:54:04 crc kubenswrapper[4952]: I1122 03:54:04.532109 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:54:04 crc kubenswrapper[4952]: E1122 03:54:04.532791 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:54:18 crc kubenswrapper[4952]: I1122 03:54:18.531050 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:54:18 crc kubenswrapper[4952]: E1122 03:54:18.531982 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:54:30 crc kubenswrapper[4952]: I1122 03:54:30.531799 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:54:30 crc kubenswrapper[4952]: E1122 03:54:30.532578 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.058966 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-x9v6c"]
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.070032 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-x9v6c"]
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.083136 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-f5d0-account-create-2p542"]
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.101493 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-f5d0-account-create-2p542"]
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.543236 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab2b754-fb45-442c-8efd-feca454390f3" path="/var/lib/kubelet/pods/aab2b754-fb45-442c-8efd-feca454390f3/volumes"
Nov 22 03:54:42 crc kubenswrapper[4952]: I1122 03:54:42.543840 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b133bfea-bd18-486b-8ab0-12bba9c84fe6" path="/var/lib/kubelet/pods/b133bfea-bd18-486b-8ab0-12bba9c84fe6/volumes"
Nov 22 03:54:43 crc kubenswrapper[4952]: I1122 03:54:43.531617 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:54:43 crc kubenswrapper[4952]: E1122 03:54:43.532124 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:54:58 crc kubenswrapper[4952]: I1122 03:54:58.531582 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:54:58 crc kubenswrapper[4952]: E1122 03:54:58.532243 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:55:03 crc kubenswrapper[4952]: I1122 03:55:03.271266 4952 scope.go:117] "RemoveContainer" containerID="89feb0c52b8664344ac195a1d423b135377ff8bb6d3372a9d3ded78920987224"
Nov 22 03:55:03 crc kubenswrapper[4952]: I1122 03:55:03.320337 4952 scope.go:117] "RemoveContainer" containerID="91121fe4e73b57946e9dd61ef4c9639fee20c38f9d4fef72738fd1645a7fed06"
Nov 22 03:55:11 crc kubenswrapper[4952]: I1122 03:55:11.531342 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:55:11 crc kubenswrapper[4952]: E1122 03:55:11.532450 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:55:13 crc kubenswrapper[4952]: I1122 03:55:13.058992 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-krnxz"]
Nov 22 03:55:13 crc kubenswrapper[4952]: I1122 03:55:13.070764 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-krnxz"]
Nov 22 03:55:14 crc kubenswrapper[4952]: I1122 03:55:14.546495 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701fbc88-a1f6-493d-b6cf-3397073a8d9d" path="/var/lib/kubelet/pods/701fbc88-a1f6-493d-b6cf-3397073a8d9d/volumes"
Nov 22 03:55:24 crc kubenswrapper[4952]: I1122 03:55:24.531307 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:55:24 crc kubenswrapper[4952]: E1122 03:55:24.534068 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.531619 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:55:35 crc kubenswrapper[4952]: E1122 03:55:35.532585 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.744923 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:35 crc kubenswrapper[4952]: E1122 03:55:35.746596 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="extract-content"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.746624 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="extract-content"
Nov 22 03:55:35 crc kubenswrapper[4952]: E1122 03:55:35.746704 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.746734 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server"
Nov 22 03:55:35 crc kubenswrapper[4952]: E1122 03:55:35.746767 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="extract-utilities"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.746841 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="extract-utilities"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.747343 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc65964-0df2-4042-ac76-cb4b7b684e23" containerName="registry-server"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.755796 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.761390 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.917182 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.917585 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tbf\" (UniqueName: \"kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:35 crc kubenswrapper[4952]: I1122 03:55:35.917739 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.020084 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.020156 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tbf\" (UniqueName: \"kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.020534 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.021047 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.021613 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.046803 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tbf\" (UniqueName: \"kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf\") pod \"community-operators-4c8xz\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") " pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.082698 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:36 crc kubenswrapper[4952]: I1122 03:55:36.639829 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:37 crc kubenswrapper[4952]: I1122 03:55:37.504526 4952 generic.go:334] "Generic (PLEG): container finished" podID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerID="5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8" exitCode=0
Nov 22 03:55:37 crc kubenswrapper[4952]: I1122 03:55:37.504712 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerDied","Data":"5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8"}
Nov 22 03:55:37 crc kubenswrapper[4952]: I1122 03:55:37.505321 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerStarted","Data":"f2586cb16d5b474d7159d395c7beaa8e3c3b4e3e8e3786bb651280083ab7500a"}
Nov 22 03:55:39 crc kubenswrapper[4952]: I1122 03:55:39.526812 4952 generic.go:334] "Generic (PLEG): container finished" podID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerID="61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3" exitCode=0
Nov 22 03:55:39 crc kubenswrapper[4952]: I1122 03:55:39.526865 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerDied","Data":"61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3"}
Nov 22 03:55:40 crc kubenswrapper[4952]: I1122 03:55:40.544772 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerStarted","Data":"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"}
Nov 22 03:55:40 crc kubenswrapper[4952]: I1122 03:55:40.565435 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4c8xz" podStartSLOduration=3.057172125 podStartE2EDuration="5.56541562s" podCreationTimestamp="2025-11-22 03:55:35 +0000 UTC" firstStartedPulling="2025-11-22 03:55:37.508933003 +0000 UTC m=+3701.814950316" lastFinishedPulling="2025-11-22 03:55:40.017176538 +0000 UTC m=+3704.323193811" observedRunningTime="2025-11-22 03:55:40.559705058 +0000 UTC m=+3704.865722351" watchObservedRunningTime="2025-11-22 03:55:40.56541562 +0000 UTC m=+3704.871432903"
Nov 22 03:55:46 crc kubenswrapper[4952]: I1122 03:55:46.083506 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:46 crc kubenswrapper[4952]: I1122 03:55:46.084195 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:46 crc kubenswrapper[4952]: I1122 03:55:46.167758 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:46 crc kubenswrapper[4952]: I1122 03:55:46.663946 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:46 crc kubenswrapper[4952]: I1122 03:55:46.715312 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:48 crc kubenswrapper[4952]: I1122 03:55:48.616366 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4c8xz" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="registry-server" containerID="cri-o://7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0" gracePeriod=2
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.227865 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.386443 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2tbf\" (UniqueName: \"kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf\") pod \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") "
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.386535 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities\") pod \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") "
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.386663 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content\") pod \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\" (UID: \"d4e7415a-9606-4a1d-b54f-2ff1d130bc27\") "
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.390275 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities" (OuterVolumeSpecName: "utilities") pod "d4e7415a-9606-4a1d-b54f-2ff1d130bc27" (UID: "d4e7415a-9606-4a1d-b54f-2ff1d130bc27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.409352 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf" (OuterVolumeSpecName: "kube-api-access-m2tbf") pod "d4e7415a-9606-4a1d-b54f-2ff1d130bc27" (UID: "d4e7415a-9606-4a1d-b54f-2ff1d130bc27"). InnerVolumeSpecName "kube-api-access-m2tbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.489397 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2tbf\" (UniqueName: \"kubernetes.io/projected/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-kube-api-access-m2tbf\") on node \"crc\" DevicePath \"\""
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.489668 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.634741 4952 generic.go:334] "Generic (PLEG): container finished" podID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerID="7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0" exitCode=0
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.634811 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerDied","Data":"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"}
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.634869 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8xz" event={"ID":"d4e7415a-9606-4a1d-b54f-2ff1d130bc27","Type":"ContainerDied","Data":"f2586cb16d5b474d7159d395c7beaa8e3c3b4e3e8e3786bb651280083ab7500a"}
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.634885 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c8xz"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.634898 4952 scope.go:117] "RemoveContainer" containerID="7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.673135 4952 scope.go:117] "RemoveContainer" containerID="61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.706882 4952 scope.go:117] "RemoveContainer" containerID="5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.751865 4952 scope.go:117] "RemoveContainer" containerID="7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"
Nov 22 03:55:49 crc kubenswrapper[4952]: E1122 03:55:49.752361 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0\": container with ID starting with 7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0 not found: ID does not exist" containerID="7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.752410 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0"} err="failed to get container status \"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0\": rpc error: code = NotFound desc = could not find container \"7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0\": container with ID starting with 7dc44e28b8ae84cc9baa9b5aaf08c6de9a0a40fad3d984d08b01843246b1c3f0 not found: ID does not exist"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.752444 4952 scope.go:117] "RemoveContainer" containerID="61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3"
Nov 22 03:55:49 crc kubenswrapper[4952]: E1122 03:55:49.752780 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3\": container with ID starting with 61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3 not found: ID does not exist" containerID="61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.752815 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3"} err="failed to get container status \"61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3\": rpc error: code = NotFound desc = could not find container \"61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3\": container with ID starting with 61b6acc476da2942683e310e3937ae14560eddffa97abab2d60a8424fa2bb9b3 not found: ID does not exist"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.752838 4952 scope.go:117] "RemoveContainer" containerID="5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8"
Nov 22 03:55:49 crc kubenswrapper[4952]: E1122 03:55:49.753305 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8\": container with ID starting with 5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8 not found: ID does not exist" containerID="5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.753342 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8"} err="failed to get container status \"5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8\": rpc error: code = NotFound desc = could not find container \"5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8\": container with ID starting with 5e3015d4d66e326ecc44d6d1af115dc077c545fa2a7f3474f52277ae837b0ad8 not found: ID does not exist"
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.910222 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4e7415a-9606-4a1d-b54f-2ff1d130bc27" (UID: "d4e7415a-9606-4a1d-b54f-2ff1d130bc27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.966207 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:49 crc kubenswrapper[4952]: I1122 03:55:49.972862 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4c8xz"]
Nov 22 03:55:50 crc kubenswrapper[4952]: I1122 03:55:50.000790 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4e7415a-9606-4a1d-b54f-2ff1d130bc27-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:55:50 crc kubenswrapper[4952]: I1122 03:55:50.531511 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:55:50 crc kubenswrapper[4952]: E1122 03:55:50.532165 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:55:50 crc kubenswrapper[4952]: I1122 03:55:50.546616 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" path="/var/lib/kubelet/pods/d4e7415a-9606-4a1d-b54f-2ff1d130bc27/volumes"
Nov 22 03:56:01 crc kubenswrapper[4952]: I1122 03:56:01.531254 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:56:01 crc kubenswrapper[4952]: E1122 03:56:01.532392 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:56:03 crc kubenswrapper[4952]: I1122 03:56:03.459462 4952 scope.go:117] "RemoveContainer" containerID="d1c3fbba899232dfe942c19bcb5664ac1ec804e3b21d27dd8b09313b440ba735"
Nov 22 03:56:15 crc kubenswrapper[4952]: I1122 03:56:15.531591 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:56:15 crc kubenswrapper[4952]: E1122 03:56:15.532887 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:56:27 crc kubenswrapper[4952]: I1122 03:56:27.530861 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:56:27 crc kubenswrapper[4952]: E1122 03:56:27.531715 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:56:40 crc kubenswrapper[4952]: I1122 03:56:40.531207 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:56:40 crc kubenswrapper[4952]: E1122 03:56:40.532266 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:56:55 crc kubenswrapper[4952]: I1122 03:56:55.531803 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:56:55 crc kubenswrapper[4952]: E1122 03:56:55.532716 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:57:09 crc kubenswrapper[4952]: I1122 03:57:09.531720 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:57:09 crc kubenswrapper[4952]: E1122 03:57:09.551928 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:57:23 crc kubenswrapper[4952]: I1122 03:57:23.531837 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:57:23 crc kubenswrapper[4952]: E1122 03:57:23.532745 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:57:36 crc kubenswrapper[4952]: I1122 03:57:36.537978 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:57:36 crc kubenswrapper[4952]: E1122 03:57:36.538834 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:57:47 crc kubenswrapper[4952]: I1122 03:57:47.531092 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:57:47 crc kubenswrapper[4952]: E1122 03:57:47.532161 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:57:59 crc kubenswrapper[4952]: I1122 03:57:59.531012 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:57:59 crc kubenswrapper[4952]: E1122 03:57:59.531936 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:58:12 crc kubenswrapper[4952]: I1122 03:58:12.531534 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:58:12 crc kubenswrapper[4952]: E1122 03:58:12.532378 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:58:24 crc kubenswrapper[4952]: I1122 03:58:24.531823 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:58:24 crc kubenswrapper[4952]: E1122 03:58:24.532745 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 03:58:39 crc kubenswrapper[4952]: I1122 03:58:39.532242 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038"
Nov 22 03:58:40 crc kubenswrapper[4952]: I1122 03:58:40.547929 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00"}
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.796520 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"]
Nov 22 03:59:10 crc kubenswrapper[4952]: E1122 03:59:10.797588 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="extract-content"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.797605 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="extract-content"
Nov 22 03:59:10 crc kubenswrapper[4952]: E1122 03:59:10.797622 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="registry-server"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.797631 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="registry-server"
Nov 22 03:59:10 crc kubenswrapper[4952]: E1122 03:59:10.797659 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="extract-utilities"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.797668 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="extract-utilities"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.797946 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e7415a-9606-4a1d-b54f-2ff1d130bc27" containerName="registry-server"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.799804 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqkcj"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.824952 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"]
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.970587 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.970632 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj"
Nov 22 03:59:10 crc kubenswrapper[4952]: I1122 03:59:10.970675 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qgr\" (UniqueName: \"kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj"
Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.072915 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj"
Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.072965 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.073006 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qgr\" (UniqueName: \"kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.073991 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.074279 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.120782 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qgr\" (UniqueName: \"kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr\") pod \"redhat-marketplace-sqkcj\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.124742 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:11 crc kubenswrapper[4952]: I1122 03:59:11.720897 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"] Nov 22 03:59:12 crc kubenswrapper[4952]: I1122 03:59:12.059440 4952 generic.go:334] "Generic (PLEG): container finished" podID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerID="ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24" exitCode=0 Nov 22 03:59:12 crc kubenswrapper[4952]: I1122 03:59:12.059528 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerDied","Data":"ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24"} Nov 22 03:59:12 crc kubenswrapper[4952]: I1122 03:59:12.059812 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerStarted","Data":"e346fe24c9cf670a2dc491f0f2e2dd1c16fe699beac7a1e9d82ff6b172332639"} Nov 22 03:59:12 crc kubenswrapper[4952]: I1122 03:59:12.062016 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:59:14 crc kubenswrapper[4952]: I1122 03:59:14.076988 4952 generic.go:334] "Generic (PLEG): container finished" podID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerID="22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710" exitCode=0 Nov 22 03:59:14 crc kubenswrapper[4952]: I1122 03:59:14.077068 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerDied","Data":"22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710"} Nov 22 03:59:15 crc kubenswrapper[4952]: I1122 03:59:15.090451 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerStarted","Data":"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc"} Nov 22 03:59:15 crc kubenswrapper[4952]: I1122 03:59:15.113747 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqkcj" podStartSLOduration=2.635933669 podStartE2EDuration="5.113730916s" podCreationTimestamp="2025-11-22 03:59:10 +0000 UTC" firstStartedPulling="2025-11-22 03:59:12.061702967 +0000 UTC m=+3916.367720250" lastFinishedPulling="2025-11-22 03:59:14.539500224 +0000 UTC m=+3918.845517497" observedRunningTime="2025-11-22 03:59:15.111090036 +0000 UTC m=+3919.417107329" watchObservedRunningTime="2025-11-22 03:59:15.113730916 +0000 UTC m=+3919.419748189" Nov 22 03:59:21 crc kubenswrapper[4952]: I1122 03:59:21.125724 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:21 crc kubenswrapper[4952]: I1122 03:59:21.126336 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:21 crc kubenswrapper[4952]: I1122 03:59:21.209705 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:21 crc kubenswrapper[4952]: I1122 03:59:21.292948 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:21 crc kubenswrapper[4952]: I1122 03:59:21.450631 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"] Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.182916 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqkcj" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="registry-server" containerID="cri-o://361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc" gracePeriod=2 Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.739980 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.851204 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content\") pod \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.851262 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities\") pod \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.851303 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95qgr\" (UniqueName: \"kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr\") pod \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\" (UID: \"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2\") " Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.853085 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities" (OuterVolumeSpecName: "utilities") pod "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" (UID: "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.874158 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr" (OuterVolumeSpecName: "kube-api-access-95qgr") pod "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" (UID: "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2"). InnerVolumeSpecName "kube-api-access-95qgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.891415 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" (UID: "9b725abb-6a6e-4c18-aba8-c039f3ebe0f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.953647 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.953854 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:59:23 crc kubenswrapper[4952]: I1122 03:59:23.953935 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95qgr\" (UniqueName: \"kubernetes.io/projected/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2-kube-api-access-95qgr\") on node \"crc\" DevicePath \"\"" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.199005 4952 generic.go:334] "Generic (PLEG): container finished" podID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerID="361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc" exitCode=0 Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.199097 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerDied","Data":"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc"} Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.199352 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqkcj" event={"ID":"9b725abb-6a6e-4c18-aba8-c039f3ebe0f2","Type":"ContainerDied","Data":"e346fe24c9cf670a2dc491f0f2e2dd1c16fe699beac7a1e9d82ff6b172332639"} Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.199386 4952 scope.go:117] "RemoveContainer" containerID="361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.199116 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqkcj" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.234373 4952 scope.go:117] "RemoveContainer" containerID="22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.253607 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"] Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.266440 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqkcj"] Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.295856 4952 scope.go:117] "RemoveContainer" containerID="ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.321197 4952 scope.go:117] "RemoveContainer" containerID="361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc" Nov 22 03:59:24 crc kubenswrapper[4952]: E1122 03:59:24.321861 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc\": container with ID starting with 361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc not found: ID does not exist" containerID="361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.321912 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc"} err="failed to get container status \"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc\": rpc error: code = NotFound desc = could not find container \"361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc\": container with ID starting with 361faa7e6d2211f4c577e2bc76f3527c453829e71d2bd4bd7bc00c4c3e68e5bc not found: ID does not exist" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.321952 4952 scope.go:117] "RemoveContainer" containerID="22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710" Nov 22 03:59:24 crc kubenswrapper[4952]: E1122 03:59:24.322305 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710\": container with ID starting with 22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710 not found: ID does not exist" containerID="22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.322347 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710"} err="failed to get container status \"22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710\": rpc error: code = NotFound desc = could not find container \"22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710\": container with ID starting with 22133ce8f958e0835f5776c77014a7fadcc241dbede68fd7caa31a810e6f8710 not found: ID does not exist" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.322372 4952 scope.go:117] "RemoveContainer" containerID="ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24" Nov 22 03:59:24 crc kubenswrapper[4952]: E1122 03:59:24.322777 4952 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24\": container with ID starting with ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24 not found: ID does not exist" containerID="ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.322810 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24"} err="failed to get container status \"ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24\": rpc error: code = NotFound desc = could not find container \"ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24\": container with ID starting with ae95edc8c3b7cf83e7e3f15c59abfe0e778ab9a25120f20e9257da59dd2dcc24 not found: ID does not exist" Nov 22 03:59:24 crc kubenswrapper[4952]: I1122 03:59:24.545981 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" path="/var/lib/kubelet/pods/9b725abb-6a6e-4c18-aba8-c039f3ebe0f2/volumes" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.499005 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 03:59:54 crc kubenswrapper[4952]: E1122 03:59:54.499964 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="extract-content" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.499979 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="extract-content" Nov 22 03:59:54 crc kubenswrapper[4952]: E1122 03:59:54.500001 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="registry-server" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.500009 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="registry-server" Nov 22 03:59:54 crc kubenswrapper[4952]: E1122 03:59:54.500023 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="extract-utilities" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.500033 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="extract-utilities" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.500271 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b725abb-6a6e-4c18-aba8-c039f3ebe0f2" containerName="registry-server" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.502120 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.550929 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.631092 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hbtp\" (UniqueName: \"kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.631404 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.632171 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.734949 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.735397 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.735516 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hbtp\" (UniqueName: \"kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.735912 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.735993 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.774507 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2hbtp\" (UniqueName: \"kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp\") pod \"certified-operators-q9lqv\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:54 crc kubenswrapper[4952]: I1122 03:59:54.832041 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 03:59:55 crc kubenswrapper[4952]: I1122 03:59:55.298327 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 03:59:55 crc kubenswrapper[4952]: I1122 03:59:55.544009 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerStarted","Data":"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4"} Nov 22 03:59:55 crc kubenswrapper[4952]: I1122 03:59:55.544294 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerStarted","Data":"655d56d47123efc4fa4a313fe44f1ee97564fad3a357dcbc1ad1c045a02e4ec7"} Nov 22 03:59:56 crc kubenswrapper[4952]: I1122 03:59:56.554212 4952 generic.go:334] "Generic (PLEG): container finished" podID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerID="24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4" exitCode=0 Nov 22 03:59:56 crc kubenswrapper[4952]: I1122 03:59:56.554525 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerDied","Data":"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4"} Nov 22 03:59:57 crc kubenswrapper[4952]: I1122 03:59:57.567204 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerStarted","Data":"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8"} Nov 22 03:59:58 crc kubenswrapper[4952]: I1122 03:59:58.577657 4952 generic.go:334] "Generic (PLEG): container finished" podID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerID="e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8" exitCode=0 Nov 22 03:59:58 crc kubenswrapper[4952]: I1122 03:59:58.577883 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerDied","Data":"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8"} Nov 22 03:59:59 crc kubenswrapper[4952]: I1122 03:59:59.589304 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerStarted","Data":"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe"} Nov 22 03:59:59 crc kubenswrapper[4952]: I1122 03:59:59.616328 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q9lqv" podStartSLOduration=3.1299551 podStartE2EDuration="5.616296781s" podCreationTimestamp="2025-11-22 03:59:54 +0000 UTC" firstStartedPulling="2025-11-22 03:59:56.558351994 +0000 UTC m=+3960.864369277" lastFinishedPulling="2025-11-22 
03:59:59.044693675 +0000 UTC m=+3963.350710958" observedRunningTime="2025-11-22 03:59:59.610402764 +0000 UTC m=+3963.916420047" watchObservedRunningTime="2025-11-22 03:59:59.616296781 +0000 UTC m=+3963.922314064" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.146175 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj"] Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.147834 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.149975 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.150620 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.163882 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj"] Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.269535 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.269675 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.270195 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5mx\" (UniqueName: \"kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.372737 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5mx\" (UniqueName: \"kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.372813 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.372848 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.373931 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.385950 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.395244 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5mx\" (UniqueName: \"kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx\") pod \"collect-profiles-29396400-ng5fj\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.521207 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:00 crc kubenswrapper[4952]: I1122 04:00:00.984627 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj"] Nov 22 04:00:01 crc kubenswrapper[4952]: I1122 04:00:01.609643 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" event={"ID":"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd","Type":"ContainerStarted","Data":"8e841337c29de0a583de85c5d01dcfa4627ab09d235a7d28c325e1cd8cc913ac"} Nov 22 04:00:01 crc kubenswrapper[4952]: I1122 04:00:01.610214 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" event={"ID":"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd","Type":"ContainerStarted","Data":"081115b4fc0517710c562960de285f06ae09f7259e50dfd701a96cfa54b0d888"} Nov 22 04:00:01 crc kubenswrapper[4952]: I1122 04:00:01.635926 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" podStartSLOduration=1.6359078679999999 podStartE2EDuration="1.635907868s" podCreationTimestamp="2025-11-22 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:00:01.627248739 +0000 UTC m=+3965.933266002" watchObservedRunningTime="2025-11-22 04:00:01.635907868 +0000 UTC m=+3965.941925151" Nov 22 04:00:02 crc kubenswrapper[4952]: I1122 04:00:02.622687 4952 generic.go:334] "Generic (PLEG): container finished" podID="8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" containerID="8e841337c29de0a583de85c5d01dcfa4627ab09d235a7d28c325e1cd8cc913ac" exitCode=0 Nov 22 04:00:02 crc kubenswrapper[4952]: I1122 04:00:02.623071 4952 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" event={"ID":"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd","Type":"ContainerDied","Data":"8e841337c29de0a583de85c5d01dcfa4627ab09d235a7d28c325e1cd8cc913ac"} Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.244712 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.353195 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume\") pod \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.353274 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume\") pod \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.353389 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft5mx\" (UniqueName: \"kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx\") pod \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\" (UID: \"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd\") " Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.354203 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" (UID: "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.364712 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" (UID: "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.364748 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx" (OuterVolumeSpecName: "kube-api-access-ft5mx") pod "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" (UID: "8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd"). InnerVolumeSpecName "kube-api-access-ft5mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.455937 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.455985 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.456004 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft5mx\" (UniqueName: \"kubernetes.io/projected/8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd-kube-api-access-ft5mx\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.643079 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" event={"ID":"8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd","Type":"ContainerDied","Data":"081115b4fc0517710c562960de285f06ae09f7259e50dfd701a96cfa54b0d888"} Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.643138 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081115b4fc0517710c562960de285f06ae09f7259e50dfd701a96cfa54b0d888" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.643156 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-ng5fj" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.707801 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"] Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.714826 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-9qxl6"] Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.833094 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.833411 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:04 crc kubenswrapper[4952]: I1122 04:00:04.882894 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:05 crc kubenswrapper[4952]: I1122 04:00:05.721717 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:05 crc kubenswrapper[4952]: I1122 04:00:05.777250 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 04:00:06 crc kubenswrapper[4952]: I1122 04:00:06.562056 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab7d465-e283-4f05-aa69-8ba55c10a609" path="/var/lib/kubelet/pods/2ab7d465-e283-4f05-aa69-8ba55c10a609/volumes" Nov 22 04:00:07 crc kubenswrapper[4952]: I1122 04:00:07.672511 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q9lqv" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="registry-server" 
containerID="cri-o://9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe" gracePeriod=2 Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.289102 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.449251 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hbtp\" (UniqueName: \"kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp\") pod \"b5ac067f-9f6a-4f05-9805-33af15b378a1\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.449370 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities\") pod \"b5ac067f-9f6a-4f05-9805-33af15b378a1\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.449448 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content\") pod \"b5ac067f-9f6a-4f05-9805-33af15b378a1\" (UID: \"b5ac067f-9f6a-4f05-9805-33af15b378a1\") " Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.450645 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities" (OuterVolumeSpecName: "utilities") pod "b5ac067f-9f6a-4f05-9805-33af15b378a1" (UID: "b5ac067f-9f6a-4f05-9805-33af15b378a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.454496 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp" (OuterVolumeSpecName: "kube-api-access-2hbtp") pod "b5ac067f-9f6a-4f05-9805-33af15b378a1" (UID: "b5ac067f-9f6a-4f05-9805-33af15b378a1"). InnerVolumeSpecName "kube-api-access-2hbtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.504024 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5ac067f-9f6a-4f05-9805-33af15b378a1" (UID: "b5ac067f-9f6a-4f05-9805-33af15b378a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.552321 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.552356 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hbtp\" (UniqueName: \"kubernetes.io/projected/b5ac067f-9f6a-4f05-9805-33af15b378a1-kube-api-access-2hbtp\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.552369 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5ac067f-9f6a-4f05-9805-33af15b378a1-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.684358 4952 generic.go:334] "Generic (PLEG): container finished" podID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerID="9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe" exitCode=0 Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.684407 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerDied","Data":"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe"} Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.684448 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9lqv" event={"ID":"b5ac067f-9f6a-4f05-9805-33af15b378a1","Type":"ContainerDied","Data":"655d56d47123efc4fa4a313fe44f1ee97564fad3a357dcbc1ad1c045a02e4ec7"} Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.684469 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9lqv" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.684508 4952 scope.go:117] "RemoveContainer" containerID="9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.712775 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.717071 4952 scope.go:117] "RemoveContainer" containerID="e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.720473 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q9lqv"] Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.742264 4952 scope.go:117] "RemoveContainer" containerID="24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.794631 4952 scope.go:117] "RemoveContainer" containerID="9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe" Nov 22 04:00:08 crc kubenswrapper[4952]: E1122 04:00:08.795158 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe\": container with ID starting with 9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe not found: ID does not exist" containerID="9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.795213 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe"} err="failed to get container status \"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe\": rpc error: code = NotFound desc = could not find container \"9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe\": container with ID starting with 9fcca2a47dcebbb13dcbc74d03373334300646e5c810bbf519aa654e4d0376fe not found: ID does not exist" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.795242 4952 scope.go:117] "RemoveContainer" containerID="e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8" Nov 22 04:00:08 crc kubenswrapper[4952]: E1122 04:00:08.795578 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8\": container with ID starting with e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8 not found: ID does not exist" containerID="e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.795631 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8"} err="failed to get container status \"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8\": rpc error: code = NotFound desc = could not find container \"e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8\": container with ID starting with e093f5289a31edd86cb37b85c7e27e6c2139b21171266fba9054980f36e389e8 not found: ID does not exist" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.795658 4952 scope.go:117] "RemoveContainer" 
containerID="24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4" Nov 22 04:00:08 crc kubenswrapper[4952]: E1122 04:00:08.795919 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4\": container with ID starting with 24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4 not found: ID does not exist" containerID="24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4" Nov 22 04:00:08 crc kubenswrapper[4952]: I1122 04:00:08.795944 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4"} err="failed to get container status \"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4\": rpc error: code = NotFound desc = could not find container \"24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4\": container with ID starting with 24955a0efba566dd38b429b6f0c5490a1452505d0c6323db03b9dbc0bd271bd4 not found: ID does not exist" Nov 22 04:00:10 crc kubenswrapper[4952]: I1122 04:00:10.550899 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" path="/var/lib/kubelet/pods/b5ac067f-9f6a-4f05-9805-33af15b378a1/volumes" Nov 22 04:00:58 crc kubenswrapper[4952]: I1122 04:00:58.342298 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:00:58 crc kubenswrapper[4952]: I1122 04:00:58.342661 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.157384 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29396401-wntch"] Nov 22 04:01:00 crc kubenswrapper[4952]: E1122 04:01:00.158458 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="registry-server" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.158478 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="registry-server" Nov 22 04:01:00 crc kubenswrapper[4952]: E1122 04:01:00.158498 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="extract-content" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.158510 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="extract-content" Nov 22 04:01:00 crc kubenswrapper[4952]: E1122 04:01:00.158532 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="extract-utilities" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.158573 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="extract-utilities" Nov 22 04:01:00 crc kubenswrapper[4952]: E1122 04:01:00.158605 
4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.158645 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.159054 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ac067f-9f6a-4f05-9805-33af15b378a1" containerName="registry-server" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.159083 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf2dd8c-ebaa-42f2-bbb5-635f960b87bd" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.160114 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.180510 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396401-wntch"] Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.244845 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhhk\" (UniqueName: \"kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.244934 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.244987 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.245045 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.347028 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhhk\" (UniqueName: \"kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.347153 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc 
kubenswrapper[4952]: I1122 04:01:00.347227 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.347316 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.355574 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.358523 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.365256 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.366674 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhhk\" (UniqueName: \"kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk\") pod \"keystone-cron-29396401-wntch\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:00 crc kubenswrapper[4952]: I1122 04:01:00.497218 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:01 crc kubenswrapper[4952]: I1122 04:01:01.002345 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396401-wntch"] Nov 22 04:01:01 crc kubenswrapper[4952]: W1122 04:01:01.007759 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod861a0695_d514_4617_9720_062db08dbae7.slice/crio-a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b WatchSource:0}: Error finding container a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b: Status 404 returned error can't find the container with id a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b Nov 22 04:01:01 crc kubenswrapper[4952]: I1122 04:01:01.223992 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-wntch" event={"ID":"861a0695-d514-4617-9720-062db08dbae7","Type":"ContainerStarted","Data":"065a8e252eb9935d4158988fabfacb430c6c58e66a19c92d39f9419c4e487916"} Nov 22 04:01:01 crc kubenswrapper[4952]: I1122 04:01:01.224270 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-wntch" event={"ID":"861a0695-d514-4617-9720-062db08dbae7","Type":"ContainerStarted","Data":"a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b"} Nov 22 04:01:01 crc kubenswrapper[4952]: I1122 04:01:01.240836 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29396401-wntch" podStartSLOduration=1.240817616 podStartE2EDuration="1.240817616s" podCreationTimestamp="2025-11-22 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:01:01.238910875 +0000 UTC m=+4025.544928158" watchObservedRunningTime="2025-11-22 04:01:01.240817616 +0000 UTC m=+4025.546834889" Nov 22 04:01:03 crc kubenswrapper[4952]: I1122 04:01:03.671570 4952 scope.go:117] "RemoveContainer" containerID="ee48ed3c49323abb80c272cfa1c21945362a2efc97c721da7de93a59886f4234" Nov 22 04:01:04 crc kubenswrapper[4952]: I1122 04:01:04.255702 4952 generic.go:334] "Generic (PLEG): container finished" podID="861a0695-d514-4617-9720-062db08dbae7" containerID="065a8e252eb9935d4158988fabfacb430c6c58e66a19c92d39f9419c4e487916" exitCode=0 Nov 22 04:01:04 crc kubenswrapper[4952]: I1122 04:01:04.255754 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-wntch" event={"ID":"861a0695-d514-4617-9720-062db08dbae7","Type":"ContainerDied","Data":"065a8e252eb9935d4158988fabfacb430c6c58e66a19c92d39f9419c4e487916"} Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.732198 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.895612 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data\") pod \"861a0695-d514-4617-9720-062db08dbae7\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.895648 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdhhk\" (UniqueName: \"kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk\") pod \"861a0695-d514-4617-9720-062db08dbae7\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.895832 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys\") pod \"861a0695-d514-4617-9720-062db08dbae7\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.895911 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle\") pod \"861a0695-d514-4617-9720-062db08dbae7\" (UID: \"861a0695-d514-4617-9720-062db08dbae7\") " Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.904050 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk" (OuterVolumeSpecName: "kube-api-access-cdhhk") pod "861a0695-d514-4617-9720-062db08dbae7" (UID: "861a0695-d514-4617-9720-062db08dbae7"). InnerVolumeSpecName "kube-api-access-cdhhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.913025 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "861a0695-d514-4617-9720-062db08dbae7" (UID: "861a0695-d514-4617-9720-062db08dbae7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.941951 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861a0695-d514-4617-9720-062db08dbae7" (UID: "861a0695-d514-4617-9720-062db08dbae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:05 crc kubenswrapper[4952]: I1122 04:01:05.953106 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data" (OuterVolumeSpecName: "config-data") pod "861a0695-d514-4617-9720-062db08dbae7" (UID: "861a0695-d514-4617-9720-062db08dbae7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.000369 4952 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.000615 4952 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.000791 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861a0695-d514-4617-9720-062db08dbae7-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.000956 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdhhk\" (UniqueName: \"kubernetes.io/projected/861a0695-d514-4617-9720-062db08dbae7-kube-api-access-cdhhk\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.277611 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-wntch" event={"ID":"861a0695-d514-4617-9720-062db08dbae7","Type":"ContainerDied","Data":"a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b"} Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.278018 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a56dbdcd4d00b8585acc9e4373d17fc73b8f96aa2ec4044b83991274a6c0391b" Nov 22 04:01:06 crc kubenswrapper[4952]: I1122 04:01:06.277712 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396401-wntch" Nov 22 04:01:28 crc kubenswrapper[4952]: I1122 04:01:28.341782 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:01:28 crc kubenswrapper[4952]: I1122 04:01:28.342272 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.341766 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.342322 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.342425 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 
04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.343081 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.343140 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00" gracePeriod=600 Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.796938 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00" exitCode=0 Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.797157 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00"} Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.797299 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5"} Nov 22 04:01:58 crc kubenswrapper[4952]: I1122 04:01:58.797327 4952 scope.go:117] "RemoveContainer" containerID="b95e6fecf45f4076f44fa4277bb70d7365a597b45ec2af84c8853029e5643038" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.114237 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:03:30 crc kubenswrapper[4952]: E1122 04:03:30.115308 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861a0695-d514-4617-9720-062db08dbae7" containerName="keystone-cron" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.115324 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="861a0695-d514-4617-9720-062db08dbae7" containerName="keystone-cron" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.115674 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="861a0695-d514-4617-9720-062db08dbae7" containerName="keystone-cron" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.117557 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.147623 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.240375 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.240610 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77gjc\" (UniqueName: \"kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.240739 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.343192 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.343261 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77gjc\" (UniqueName: \"kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.343309 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.344098 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.344117 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.368027 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-77gjc\" (UniqueName: \"kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc\") pod \"redhat-operators-bq4mv\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:30 crc kubenswrapper[4952]: I1122 04:03:30.460431 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:03:31 crc kubenswrapper[4952]: I1122 04:03:31.036219 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:03:31 crc kubenswrapper[4952]: I1122 04:03:31.796027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerStarted","Data":"ebe8d8e6b4dd7ba2ba4dddf2d12019bc97614cfe32b052b62f4f9e263c3a1f1e"} Nov 22 04:03:33 crc kubenswrapper[4952]: I1122 04:03:32.813081 4952 generic.go:334] "Generic (PLEG): container finished" podID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerID="ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9" exitCode=0 Nov 22 04:03:33 crc kubenswrapper[4952]: I1122 04:03:32.813263 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerDied","Data":"ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9"} Nov 22 04:03:36 crc kubenswrapper[4952]: I1122 04:03:36.867085 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerStarted","Data":"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56"} Nov 22 04:03:52 crc kubenswrapper[4952]: I1122 04:03:52.051321 4952 generic.go:334] "Generic (PLEG): container finished" podID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerID="b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56" exitCode=0 Nov 22 04:03:52 crc kubenswrapper[4952]: I1122 04:03:52.051375 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerDied","Data":"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56"} Nov 22 04:03:53 crc kubenswrapper[4952]: I1122 04:03:53.061830 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerStarted","Data":"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f"} Nov 22 04:03:53 crc kubenswrapper[4952]: I1122 04:03:53.085121 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bq4mv" podStartSLOduration=3.385882494 podStartE2EDuration="23.085104493s" podCreationTimestamp="2025-11-22 04:03:30 +0000 UTC" firstStartedPulling="2025-11-22 04:03:32.816133068 +0000 UTC m=+4177.122150341" lastFinishedPulling="2025-11-22 04:03:52.515355067 +0000 UTC m=+4196.821372340" observedRunningTime="2025-11-22 04:03:53.080031938 +0000 UTC m=+4197.386049211" watchObservedRunningTime="2025-11-22 04:03:53.085104493 +0000 UTC m=+4197.391121766" Nov 22 04:03:58 crc kubenswrapper[4952]: I1122 04:03:58.341741 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:03:58 crc kubenswrapper[4952]: I1122 04:03:58.342279 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:04:00 crc kubenswrapper[4952]: I1122 04:04:00.460748 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:00 crc kubenswrapper[4952]: I1122 04:04:00.461033 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:00 crc kubenswrapper[4952]: I1122 04:04:00.513769 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:01 crc kubenswrapper[4952]: I1122 04:04:01.277497 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:01 crc kubenswrapper[4952]: I1122 04:04:01.338648 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:04:03 crc kubenswrapper[4952]: I1122 04:04:03.210582 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bq4mv" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="registry-server" containerID="cri-o://1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f" gracePeriod=2 Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.176621 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.233887 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bq4mv" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.233953 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerDied","Data":"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f"} Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.233806 4952 generic.go:334] "Generic (PLEG): container finished" podID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerID="1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f" exitCode=0 Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.234092 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bq4mv" event={"ID":"5dc79b42-d4f8-4ff6-8118-fba0e1244f66","Type":"ContainerDied","Data":"ebe8d8e6b4dd7ba2ba4dddf2d12019bc97614cfe32b052b62f4f9e263c3a1f1e"} Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.234039 4952 scope.go:117] "RemoveContainer" containerID="1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.257527 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities\") pod \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.257649 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77gjc\" (UniqueName: \"kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc\") pod \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.257672 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content\") pod \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\" (UID: \"5dc79b42-d4f8-4ff6-8118-fba0e1244f66\") " Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.259385 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities" (OuterVolumeSpecName: "utilities") pod "5dc79b42-d4f8-4ff6-8118-fba0e1244f66" (UID: "5dc79b42-d4f8-4ff6-8118-fba0e1244f66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.263127 4952 scope.go:117] "RemoveContainer" containerID="b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.265702 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc" (OuterVolumeSpecName: "kube-api-access-77gjc") pod "5dc79b42-d4f8-4ff6-8118-fba0e1244f66" (UID: "5dc79b42-d4f8-4ff6-8118-fba0e1244f66"). InnerVolumeSpecName "kube-api-access-77gjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.348704 4952 scope.go:117] "RemoveContainer" containerID="ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.360295 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.360331 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77gjc\" (UniqueName: \"kubernetes.io/projected/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-kube-api-access-77gjc\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.366152 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dc79b42-d4f8-4ff6-8118-fba0e1244f66" (UID: "5dc79b42-d4f8-4ff6-8118-fba0e1244f66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.386593 4952 scope.go:117] "RemoveContainer" containerID="1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f" Nov 22 04:04:04 crc kubenswrapper[4952]: E1122 04:04:04.390280 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f\": container with ID starting with 1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f not found: ID does not exist" containerID="1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.390435 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f"} err="failed to get container status \"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f\": rpc error: code = NotFound desc = could not find container \"1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f\": container with ID starting with 1ce6e9f34d17c65f2da9f43a495a137c2dd2e49623f5217ad56ac49f9eabe77f not found: ID does not exist" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.390471 4952 scope.go:117] "RemoveContainer" containerID="b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56" Nov 22 04:04:04 crc kubenswrapper[4952]: E1122 04:04:04.391748 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56\": container with ID starting with b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56 not found: ID does not exist" containerID="b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.391833 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56"} err="failed to get container status \"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56\": rpc error: code = NotFound desc = could not find container 
\"b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56\": container with ID starting with b21419b318839ae0ba0aa18b9ad10076842f56641e27f36c25dc5ab8fcc2fd56 not found: ID does not exist" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.391887 4952 scope.go:117] "RemoveContainer" containerID="ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9" Nov 22 04:04:04 crc kubenswrapper[4952]: E1122 04:04:04.392356 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9\": container with ID starting with ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9 not found: ID does not exist" containerID="ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.392415 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9"} err="failed to get container status \"ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9\": rpc error: code = NotFound desc = could not find container \"ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9\": container with ID starting with ed28a26dc77fb9239a131b14b537703947bfb3130d4cdee89491cac83a32d0d9 not found: ID does not exist" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.462557 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc79b42-d4f8-4ff6-8118-fba0e1244f66-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.584342 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:04:04 crc kubenswrapper[4952]: I1122 04:04:04.594072 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bq4mv"] Nov 22 04:04:06 crc kubenswrapper[4952]: I1122 04:04:06.548519 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" path="/var/lib/kubelet/pods/5dc79b42-d4f8-4ff6-8118-fba0e1244f66/volumes" Nov 22 04:04:28 crc kubenswrapper[4952]: I1122 04:04:28.341942 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:04:28 crc kubenswrapper[4952]: I1122 04:04:28.343638 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.342791 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.343327 4952 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.343371 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.344138 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.344200 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" gracePeriod=600 Nov 22 04:04:58 crc kubenswrapper[4952]: E1122 04:04:58.474163 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.861245 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" exitCode=0 Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.861459 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5"} Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.861601 4952 scope.go:117] "RemoveContainer" containerID="93216e93c086c713d83ad7c832be17e827e54ceb3d8d0bde6d6279853a9d4a00" Nov 22 04:04:58 crc kubenswrapper[4952]: I1122 04:04:58.862306 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:04:58 crc kubenswrapper[4952]: E1122 04:04:58.862572 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:05:09 crc kubenswrapper[4952]: I1122 04:05:09.531360 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:05:09 crc kubenswrapper[4952]: E1122 04:05:09.532275 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:05:21 crc kubenswrapper[4952]: I1122 04:05:21.531414 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:05:21 crc kubenswrapper[4952]: E1122 04:05:21.532462 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:05:32 crc kubenswrapper[4952]: I1122 04:05:32.531872 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:05:32 crc kubenswrapper[4952]: E1122 04:05:32.534209 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:05:45 crc kubenswrapper[4952]: I1122 04:05:45.531789 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:05:45 crc kubenswrapper[4952]: E1122 04:05:45.533030 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:06:00 crc kubenswrapper[4952]: I1122 04:06:00.531659 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:06:00 crc kubenswrapper[4952]: E1122 04:06:00.532660 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.871851 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:06 crc kubenswrapper[4952]: E1122 04:06:06.873078 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="extract-utilities" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.873094 4952 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="extract-utilities" Nov 22 04:06:06 crc kubenswrapper[4952]: E1122 04:06:06.873115 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="registry-server" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.873121 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="registry-server" Nov 22 04:06:06 crc kubenswrapper[4952]: E1122 04:06:06.873146 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="extract-content" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.873154 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="extract-content" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.873352 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc79b42-d4f8-4ff6-8118-fba0e1244f66" containerName="registry-server" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.874854 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:06 crc kubenswrapper[4952]: I1122 04:06:06.900950 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.056948 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4t9\" (UniqueName: \"kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.057286 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.057322 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.158764 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4t9\" (UniqueName: \"kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.158874 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 
04:06:07.158909 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.159392 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.159896 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.206088 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4t9\" (UniqueName: \"kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9\") pod \"community-operators-crssg\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.501506 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:07 crc kubenswrapper[4952]: I1122 04:06:07.938221 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:08 crc kubenswrapper[4952]: I1122 04:06:08.627700 4952 generic.go:334] "Generic (PLEG): container finished" podID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerID="23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb" exitCode=0 Nov 22 04:06:08 crc kubenswrapper[4952]: I1122 04:06:08.627777 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerDied","Data":"23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb"} Nov 22 04:06:08 crc kubenswrapper[4952]: I1122 04:06:08.628627 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerStarted","Data":"a6cbe4b5c224a69f1dc88c8bd6fdc96c6458784443e81edbaebf9642481bb778"} Nov 22 04:06:08 crc kubenswrapper[4952]: I1122 04:06:08.632901 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:06:10 crc kubenswrapper[4952]: I1122 04:06:10.650234 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerStarted","Data":"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06"} Nov 22 04:06:11 crc kubenswrapper[4952]: I1122 04:06:11.531027 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:06:11 crc kubenswrapper[4952]: E1122 04:06:11.531380 4952 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:06:14 crc kubenswrapper[4952]: I1122 04:06:14.720422 4952 generic.go:334] "Generic (PLEG): container finished" podID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerID="f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06" exitCode=0 Nov 22 04:06:14 crc kubenswrapper[4952]: I1122 04:06:14.720913 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerDied","Data":"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06"} Nov 22 04:06:15 crc kubenswrapper[4952]: I1122 04:06:15.740104 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerStarted","Data":"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c"} Nov 22 04:06:15 crc kubenswrapper[4952]: I1122 04:06:15.764905 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-crssg" podStartSLOduration=3.149312923 podStartE2EDuration="9.764881216s" podCreationTimestamp="2025-11-22 04:06:06 +0000 UTC" firstStartedPulling="2025-11-22 04:06:08.632659644 +0000 UTC m=+4332.938676907" lastFinishedPulling="2025-11-22 04:06:15.248227887 +0000 UTC m=+4339.554245200" observedRunningTime="2025-11-22 04:06:15.763663163 +0000 UTC m=+4340.069680486" watchObservedRunningTime="2025-11-22 04:06:15.764881216 +0000 UTC m=+4340.070898499" Nov 22 04:06:17 crc kubenswrapper[4952]: I1122 04:06:17.502894 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:17 crc kubenswrapper[4952]: I1122 04:06:17.503486 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:17 crc kubenswrapper[4952]: I1122 04:06:17.559310 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:22 crc kubenswrapper[4952]: I1122 04:06:22.531797 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:06:22 crc kubenswrapper[4952]: E1122 04:06:22.532534 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:06:27 crc kubenswrapper[4952]: I1122 04:06:27.554342 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:27 crc kubenswrapper[4952]: I1122 04:06:27.620634 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:27 crc 
kubenswrapper[4952]: I1122 04:06:27.875498 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-crssg" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="registry-server" containerID="cri-o://e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c" gracePeriod=2 Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.469670 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.635225 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content\") pod \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.635273 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh4t9\" (UniqueName: \"kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9\") pod \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.635294 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities\") pod \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\" (UID: \"f6f186d6-a112-4b07-ac4f-4bdcc0911cde\") " Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.637509 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities" (OuterVolumeSpecName: "utilities") pod "f6f186d6-a112-4b07-ac4f-4bdcc0911cde" (UID: "f6f186d6-a112-4b07-ac4f-4bdcc0911cde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.664729 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9" (OuterVolumeSpecName: "kube-api-access-gh4t9") pod "f6f186d6-a112-4b07-ac4f-4bdcc0911cde" (UID: "f6f186d6-a112-4b07-ac4f-4bdcc0911cde"). InnerVolumeSpecName "kube-api-access-gh4t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.690875 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6f186d6-a112-4b07-ac4f-4bdcc0911cde" (UID: "f6f186d6-a112-4b07-ac4f-4bdcc0911cde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.737696 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.737907 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.737966 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh4t9\" (UniqueName: \"kubernetes.io/projected/f6f186d6-a112-4b07-ac4f-4bdcc0911cde-kube-api-access-gh4t9\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.890154 4952 generic.go:334] "Generic (PLEG): container finished" podID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerID="e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c" exitCode=0 Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.890225 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crssg" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.890255 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerDied","Data":"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c"} Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.890656 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crssg" event={"ID":"f6f186d6-a112-4b07-ac4f-4bdcc0911cde","Type":"ContainerDied","Data":"a6cbe4b5c224a69f1dc88c8bd6fdc96c6458784443e81edbaebf9642481bb778"} Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.890680 4952 scope.go:117] "RemoveContainer" containerID="e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.917968 4952 scope.go:117] "RemoveContainer" containerID="f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.929511 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.941355 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-crssg"] Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.950169 4952 scope.go:117] "RemoveContainer" containerID="23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.988934 4952 scope.go:117] "RemoveContainer" containerID="e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c" Nov 22 04:06:28 crc kubenswrapper[4952]: E1122 04:06:28.989770 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c\": container with ID starting with e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c not found: ID does not exist" containerID="e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.990305 
4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c"} err="failed to get container status \"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c\": rpc error: code = NotFound desc = could not find container \"e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c\": container with ID starting with e304429a45b6973f0b9c520293068f4c1347f42d3a1425663e6532b5ed36600c not found: ID does not exist" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.990342 4952 scope.go:117] "RemoveContainer" containerID="f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06" Nov 22 04:06:28 crc kubenswrapper[4952]: E1122 04:06:28.990884 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06\": container with ID starting with f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06 not found: ID does not exist" containerID="f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.990909 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06"} err="failed to get container status \"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06\": rpc error: code = NotFound desc = could not find container \"f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06\": container with ID starting with f58bd361c818c1bbff8c471d90d6d54d1c3c8bcf0d7a476bbcc8bfd2939e3f06 not found: ID does not exist" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.990934 4952 scope.go:117] "RemoveContainer" containerID="23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb" Nov 22 04:06:28 crc kubenswrapper[4952]: E1122 04:06:28.991665 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb\": container with ID starting with 23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb not found: ID does not exist" containerID="23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb" Nov 22 04:06:28 crc kubenswrapper[4952]: I1122 04:06:28.991682 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb"} err="failed to get container status \"23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb\": rpc error: code = NotFound desc = could not find container \"23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb\": container with ID starting with 23c2591d50e17f9d40882a83f3931bcda52150e5568550de0ab4771388227aeb not found: ID does not exist" Nov 22 04:06:30 crc kubenswrapper[4952]: I1122 04:06:30.544732 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" path="/var/lib/kubelet/pods/f6f186d6-a112-4b07-ac4f-4bdcc0911cde/volumes" Nov 22 04:06:37 crc kubenswrapper[4952]: I1122 04:06:37.531474 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:06:37 crc kubenswrapper[4952]: E1122 04:06:37.533014 4952 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:06:50 crc kubenswrapper[4952]: I1122 04:06:50.532074 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:06:50 crc kubenswrapper[4952]: E1122 04:06:50.533128 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:07:01 crc kubenswrapper[4952]: I1122 04:07:01.531028 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:07:01 crc kubenswrapper[4952]: E1122 04:07:01.532008 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:07:16 crc kubenswrapper[4952]: I1122 04:07:16.538985 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:07:16 crc kubenswrapper[4952]: E1122 04:07:16.539669 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:07:28 crc kubenswrapper[4952]: I1122 04:07:28.532158 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:07:28 crc kubenswrapper[4952]: E1122 04:07:28.533170 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:07:40 crc kubenswrapper[4952]: I1122 04:07:40.531982 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:07:40 crc kubenswrapper[4952]: E1122 04:07:40.532733 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:07:55 crc kubenswrapper[4952]: I1122 04:07:55.531010 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:07:55 crc kubenswrapper[4952]: E1122 04:07:55.531806 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:08:07 crc kubenswrapper[4952]: I1122 04:08:07.530929 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:07 crc kubenswrapper[4952]: E1122 04:08:07.531832 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:08:20 crc kubenswrapper[4952]: I1122 04:08:20.532264 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:20 crc kubenswrapper[4952]: E1122 04:08:20.533135 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:08:31 crc kubenswrapper[4952]: I1122 04:08:31.531800 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:31 crc kubenswrapper[4952]: E1122 04:08:31.533079 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:08:43 crc kubenswrapper[4952]: I1122 04:08:43.530781 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:43 crc kubenswrapper[4952]: E1122 04:08:43.531503 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" 
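The elided run above is the kubelet's restart back-off at its ceiling: machine-config-daemon has crashed often enough that every sync attempt (roughly every 11-15 s here, the kubelet's retry cadence) is skipped with the same "back-off 5m0s" error until the window expires; the container is finally restarted at 04:10:04 below. A minimal sketch of the delay schedule, assuming the documented kubelet behavior (10 s initial delay, doubled after every crash, capped at 5m0s); backoff_schedule is our illustrative name, not kubelet source:

    # crashloop_backoff.py -- illustrative sketch, not kubelet code.
    # Models the kubelet's documented restart back-off: 10s initial delay,
    # doubled after every crash, capped at 5m0s (the "back-off 5m0s" above).
    def backoff_schedule(restarts: int, initial: float = 10.0, cap: float = 300.0):
        """Delay in seconds the kubelet waits before each of the first `restarts` restarts."""
        delays, delay = [], initial
        for _ in range(restarts):
            delays.append(min(delay, cap))
            delay *= 2.0
        return delays

    print(backoff_schedule(7))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]

Nov 22 04:08:43 crc kubenswrapper[4952]: I1122 04:08:43.530781 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:43 crc kubenswrapper[4952]: E1122 04:08:43.531503 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl"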
podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:08:56 crc kubenswrapper[4952]: I1122 04:08:56.538696 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:08:56 crc kubenswrapper[4952]: E1122 04:08:56.541176 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:09:09 crc kubenswrapper[4952]: I1122 04:09:09.532093 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:09:09 crc kubenswrapper[4952]: E1122 04:09:09.533090 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.911738 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:20 crc kubenswrapper[4952]: E1122 04:09:20.913137 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="registry-server" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.913164 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="registry-server" Nov 22 04:09:20 crc kubenswrapper[4952]: E1122 04:09:20.913189 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="extract-content" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.913201 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="extract-content" Nov 22 04:09:20 crc kubenswrapper[4952]: E1122 04:09:20.913238 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="extract-utilities" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.913279 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="extract-utilities" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.913678 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f186d6-a112-4b07-ac4f-4bdcc0911cde" containerName="registry-server" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.916014 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.925373 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.936130 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.936253 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmz5p\" (UniqueName: \"kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:20 crc kubenswrapper[4952]: I1122 04:09:20.936703 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.039114 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.039233 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmz5p\" (UniqueName: \"kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.039348 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.039881 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.039936 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.074707 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bmz5p\" (UniqueName: \"kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p\") pod \"redhat-marketplace-xff2w\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.276020 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:21 crc kubenswrapper[4952]: I1122 04:09:21.790773 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:22 crc kubenswrapper[4952]: I1122 04:09:22.533129 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:09:22 crc kubenswrapper[4952]: E1122 04:09:22.534202 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:09:22 crc kubenswrapper[4952]: I1122 04:09:22.675495 4952 generic.go:334] "Generic (PLEG): container finished" podID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerID="4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476" exitCode=0 Nov 22 04:09:22 crc kubenswrapper[4952]: I1122 04:09:22.675628 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerDied","Data":"4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476"} Nov 22 04:09:22 crc kubenswrapper[4952]: I1122 04:09:22.675681 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerStarted","Data":"4753d92d5e8aeb5a2eb47a6d25f4ba0d32b95de15470a1d0bb6be69acf14cb1f"} Nov 22 04:09:24 crc kubenswrapper[4952]: I1122 04:09:24.698392 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerStarted","Data":"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79"} Nov 22 04:09:26 crc kubenswrapper[4952]: I1122 04:09:26.717566 4952 generic.go:334] "Generic (PLEG): container finished" podID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerID="b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79" exitCode=0 Nov 22 04:09:26 crc kubenswrapper[4952]: I1122 04:09:26.717596 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerDied","Data":"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79"} Nov 22 04:09:27 crc kubenswrapper[4952]: I1122 04:09:27.730475 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerStarted","Data":"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150"} Nov 22 04:09:27 crc kubenswrapper[4952]: I1122 04:09:27.757539 4952 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-xff2w" podStartSLOduration=3.245013201 podStartE2EDuration="7.757524184s" podCreationTimestamp="2025-11-22 04:09:20 +0000 UTC" firstStartedPulling="2025-11-22 04:09:22.679270601 +0000 UTC m=+4526.985287914" lastFinishedPulling="2025-11-22 04:09:27.191781624 +0000 UTC m=+4531.497798897" observedRunningTime="2025-11-22 04:09:27.755672285 +0000 UTC m=+4532.061689578" watchObservedRunningTime="2025-11-22 04:09:27.757524184 +0000 UTC m=+4532.063541457" Nov 22 04:09:31 crc kubenswrapper[4952]: I1122 04:09:31.276336 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:31 crc kubenswrapper[4952]: I1122 04:09:31.277522 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:31 crc kubenswrapper[4952]: I1122 04:09:31.346627 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:36 crc kubenswrapper[4952]: I1122 04:09:36.539584 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:09:36 crc kubenswrapper[4952]: E1122 04:09:36.540328 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:09:41 crc kubenswrapper[4952]: I1122 04:09:41.350956 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:41 crc kubenswrapper[4952]: I1122 04:09:41.414672 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:41 crc kubenswrapper[4952]: I1122 04:09:41.883940 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xff2w" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="registry-server" containerID="cri-o://f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150" gracePeriod=2 Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.521799 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.646923 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities\") pod \"f75633c4-435b-493a-a5b8-4fd1cb779109\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.647116 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content\") pod \"f75633c4-435b-493a-a5b8-4fd1cb779109\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.647146 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmz5p\" (UniqueName: \"kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p\") pod \"f75633c4-435b-493a-a5b8-4fd1cb779109\" (UID: \"f75633c4-435b-493a-a5b8-4fd1cb779109\") " Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.648054 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities" (OuterVolumeSpecName: "utilities") pod "f75633c4-435b-493a-a5b8-4fd1cb779109" (UID: "f75633c4-435b-493a-a5b8-4fd1cb779109"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.648464 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.653198 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p" (OuterVolumeSpecName: "kube-api-access-bmz5p") pod "f75633c4-435b-493a-a5b8-4fd1cb779109" (UID: "f75633c4-435b-493a-a5b8-4fd1cb779109"). InnerVolumeSpecName "kube-api-access-bmz5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.664852 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f75633c4-435b-493a-a5b8-4fd1cb779109" (UID: "f75633c4-435b-493a-a5b8-4fd1cb779109"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.750577 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f75633c4-435b-493a-a5b8-4fd1cb779109-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.750617 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmz5p\" (UniqueName: \"kubernetes.io/projected/f75633c4-435b-493a-a5b8-4fd1cb779109-kube-api-access-bmz5p\") on node \"crc\" DevicePath \"\"" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.895893 4952 generic.go:334] "Generic (PLEG): container finished" podID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerID="f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150" exitCode=0 Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.895935 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xff2w" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.895943 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerDied","Data":"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150"} Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.895986 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xff2w" event={"ID":"f75633c4-435b-493a-a5b8-4fd1cb779109","Type":"ContainerDied","Data":"4753d92d5e8aeb5a2eb47a6d25f4ba0d32b95de15470a1d0bb6be69acf14cb1f"} Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.896009 4952 scope.go:117] "RemoveContainer" containerID="f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.932061 4952 scope.go:117] "RemoveContainer" containerID="b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79" Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.934385 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.945989 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xff2w"] Nov 22 04:09:42 crc kubenswrapper[4952]: I1122 04:09:42.967839 4952 scope.go:117] "RemoveContainer" containerID="4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.009082 4952 scope.go:117] "RemoveContainer" containerID="f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150" Nov 22 04:09:43 crc kubenswrapper[4952]: E1122 04:09:43.010424 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150\": container with ID starting with f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150 not found: ID does not exist" containerID="f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.010476 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150"} err="failed to get container status 
\"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150\": rpc error: code = NotFound desc = could not find container \"f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150\": container with ID starting with f55d0fd1868eae258e6d5e5aa9aaf3db3cce26699b541266409b808d70e3d150 not found: ID does not exist" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.010507 4952 scope.go:117] "RemoveContainer" containerID="b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79" Nov 22 04:09:43 crc kubenswrapper[4952]: E1122 04:09:43.010905 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79\": container with ID starting with b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79 not found: ID does not exist" containerID="b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.010937 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79"} err="failed to get container status \"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79\": rpc error: code = NotFound desc = could not find container \"b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79\": container with ID starting with b56804aba92dcffc57c86da10cfa1101fd256f1e04b7d01814b8a2a1eb634e79 not found: ID does not exist" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.010955 4952 scope.go:117] "RemoveContainer" containerID="4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476" Nov 22 04:09:43 crc kubenswrapper[4952]: E1122 04:09:43.011339 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476\": container with ID starting with 4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476 not found: ID does not exist" containerID="4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476" Nov 22 04:09:43 crc kubenswrapper[4952]: I1122 04:09:43.011382 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476"} err="failed to get container status \"4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476\": rpc error: code = NotFound desc = could not find container \"4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476\": container with ID starting with 4368565cf776872be00927091ee5a2c766a40ad1362eea4ab5d79843b991b476 not found: ID does not exist" Nov 22 04:09:44 crc kubenswrapper[4952]: I1122 04:09:44.545924 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" path="/var/lib/kubelet/pods/f75633c4-435b-493a-a5b8-4fd1cb779109/volumes" Nov 22 04:09:50 crc kubenswrapper[4952]: I1122 04:09:50.532058 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:09:50 crc kubenswrapper[4952]: E1122 04:09:50.532786 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:10:03 crc kubenswrapper[4952]: I1122 04:10:03.532340 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:10:04 crc kubenswrapper[4952]: I1122 04:10:04.181713 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783"} Nov 22 04:11:29 crc kubenswrapper[4952]: I1122 04:11:29.733396 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ba66b462-c52b-4474-80c9-670bf6be8870" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Nov 22 04:11:35 crc kubenswrapper[4952]: I1122 04:11:35.733728 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ba66b462-c52b-4474-80c9-670bf6be8870" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Nov 22 04:11:36 crc kubenswrapper[4952]: I1122 04:11:36.186883 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="2ff51385-a462-478d-bd61-62d15d7c5c41" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.155:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.363673 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="e99cda79-b32c-4e09-8c24-9a4eb0c934ef" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.232:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.363696 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="6dd872c8-ca07-4e06-9666-22d89916ead1" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.0.233:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.617247 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ba66b462-c52b-4474-80c9-670bf6be8870" containerName="ceilometer-central-agent" probeResult="failure" output=< Nov 22 04:11:37 crc kubenswrapper[4952]: Unkown error: Expecting value: line 1 column 1 (char 0) Nov 22 04:11:37 crc kubenswrapper[4952]: > Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.617351 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.618523 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"c21565680c74156ecf3f9c8f03c688603e116314d6601ed9cd5d62a4ecda977f"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Nov 22 04:11:37 crc kubenswrapper[4952]: I1122 04:11:37.618699 4952 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="ba66b462-c52b-4474-80c9-670bf6be8870" containerName="ceilometer-central-agent" containerID="cri-o://c21565680c74156ecf3f9c8f03c688603e116314d6601ed9cd5d62a4ecda977f" gracePeriod=30 Nov 22 04:11:39 crc kubenswrapper[4952]: I1122 04:11:39.223478 4952 generic.go:334] "Generic (PLEG): container finished" podID="ba66b462-c52b-4474-80c9-670bf6be8870" containerID="c21565680c74156ecf3f9c8f03c688603e116314d6601ed9cd5d62a4ecda977f" exitCode=0 Nov 22 04:11:39 crc kubenswrapper[4952]: I1122 04:11:39.223593 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerDied","Data":"c21565680c74156ecf3f9c8f03c688603e116314d6601ed9cd5d62a4ecda977f"} Nov 22 04:11:39 crc kubenswrapper[4952]: I1122 04:11:39.246384 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:11:42 crc kubenswrapper[4952]: I1122 04:11:42.256400 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba66b462-c52b-4474-80c9-670bf6be8870","Type":"ContainerStarted","Data":"b6cdf74ec631ab75965f762fb2a8a9405aee1300a558ce664acb0d40e9ea7247"} Nov 22 04:12:28 crc kubenswrapper[4952]: I1122 04:12:28.342379 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:12:28 crc kubenswrapper[4952]: I1122 04:12:28.344261 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:12:58 crc kubenswrapper[4952]: I1122 04:12:58.342412 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:12:58 crc kubenswrapper[4952]: I1122 04:12:58.342896 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:13:28 crc kubenswrapper[4952]: I1122 04:13:28.341478 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:13:28 crc kubenswrapper[4952]: I1122 04:13:28.342043 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:13:28 crc kubenswrapper[4952]: I1122 04:13:28.342093 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 04:13:28 crc kubenswrapper[4952]: I1122 04:13:28.343117 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:13:28 crc kubenswrapper[4952]: I1122 04:13:28.343187 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783" gracePeriod=600 Nov 22 04:13:29 crc kubenswrapper[4952]: I1122 04:13:29.283481 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783" exitCode=0 Nov 22 04:13:29 crc kubenswrapper[4952]: I1122 04:13:29.283531 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783"} Nov 22 04:13:29 crc kubenswrapper[4952]: I1122 04:13:29.283844 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"} Nov 22 04:13:29 crc kubenswrapper[4952]: I1122 04:13:29.283864 4952 scope.go:117] "RemoveContainer" containerID="d03c2c71cb02b9189d3552e5bc61b7c4fd4b8cc7be7758153a411a6669fcd1f5" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.620519 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:14:49 crc kubenswrapper[4952]: E1122 04:14:49.622177 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="registry-server" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.622199 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="registry-server" Nov 22 04:14:49 crc kubenswrapper[4952]: E1122 04:14:49.622240 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="extract-content" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.622248 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="extract-content" Nov 22 04:14:49 crc kubenswrapper[4952]: E1122 04:14:49.622310 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="extract-utilities" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.622325 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="extract-utilities" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.622601 4952 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f75633c4-435b-493a-a5b8-4fd1cb779109" containerName="registry-server" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.625506 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.634661 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.730171 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.730940 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.730977 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn5mp\" (UniqueName: \"kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.833263 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.833342 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn5mp\" (UniqueName: \"kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.833601 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.834387 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.834782 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " 
pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.878506 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn5mp\" (UniqueName: \"kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp\") pod \"redhat-operators-mfkrp\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:49 crc kubenswrapper[4952]: I1122 04:14:49.947333 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:50 crc kubenswrapper[4952]: I1122 04:14:50.547246 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:14:50 crc kubenswrapper[4952]: I1122 04:14:50.980940 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:14:50 crc kubenswrapper[4952]: I1122 04:14:50.989184 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.040278 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.073048 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8ng\" (UniqueName: \"kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.073100 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.073162 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.125429 4952 generic.go:334] "Generic (PLEG): container finished" podID="20364d86-f630-4bc4-86e3-220d711e01bf" containerID="a0ac5e473eabd235f3e78ea895aafbfbe889a30d9c70b6b1381eec05268b677d" exitCode=0 Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.125615 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerDied","Data":"a0ac5e473eabd235f3e78ea895aafbfbe889a30d9c70b6b1381eec05268b677d"} Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.126089 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerStarted","Data":"c1e97359f15d421b7e2993483e8dd70ca349c81ce15dbb2350e5ce45113df019"} Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.175301 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.175499 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8ng\" (UniqueName: \"kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.175569 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.176245 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.176418 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.195168 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8ng\" (UniqueName: \"kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng\") pod \"certified-operators-l8hrq\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.352670 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:14:51 crc kubenswrapper[4952]: I1122 04:14:51.924418 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:14:51 crc kubenswrapper[4952]: W1122 04:14:51.925952 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be8daeb_e687_43ba_a52d_33a21f12db04.slice/crio-323bff33b494256c950620a8ea5a4a52844b6c8facc67b466238ac9f462a38d3 WatchSource:0}: Error finding container 323bff33b494256c950620a8ea5a4a52844b6c8facc67b466238ac9f462a38d3: Status 404 returned error can't find the container with id 323bff33b494256c950620a8ea5a4a52844b6c8facc67b466238ac9f462a38d3 Nov 22 04:14:52 crc kubenswrapper[4952]: I1122 04:14:52.137895 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerStarted","Data":"323bff33b494256c950620a8ea5a4a52844b6c8facc67b466238ac9f462a38d3"} Nov 22 04:14:53 crc kubenswrapper[4952]: I1122 04:14:53.151513 4952 generic.go:334] "Generic (PLEG): container finished" podID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerID="19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6" exitCode=0 Nov 22 04:14:53 crc kubenswrapper[4952]: I1122 04:14:53.151598 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerDied","Data":"19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6"} Nov 22 04:14:54 crc kubenswrapper[4952]: I1122 04:14:54.177414 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerStarted","Data":"8219eb348be763327d2985a543dd4003ffc37b6ec9769fa2a3ff8270fb718eb7"} Nov 22 04:14:55 crc kubenswrapper[4952]: I1122 04:14:55.188607 4952 generic.go:334] "Generic (PLEG): container finished" podID="20364d86-f630-4bc4-86e3-220d711e01bf" containerID="8219eb348be763327d2985a543dd4003ffc37b6ec9769fa2a3ff8270fb718eb7" exitCode=0 Nov 22 04:14:55 crc kubenswrapper[4952]: I1122 04:14:55.188697 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerDied","Data":"8219eb348be763327d2985a543dd4003ffc37b6ec9769fa2a3ff8270fb718eb7"} Nov 22 04:14:56 crc kubenswrapper[4952]: I1122 04:14:56.202635 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerStarted","Data":"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22"} Nov 22 04:14:57 crc kubenswrapper[4952]: I1122 04:14:57.217000 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerStarted","Data":"c34f5760dd44b8d0c3f0b6d240e6dbdc3a1837c09067989158f4cb280585a9f4"} Nov 22 04:14:57 crc kubenswrapper[4952]: I1122 04:14:57.260081 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mfkrp" podStartSLOduration=3.15076864 podStartE2EDuration="8.260054376s" podCreationTimestamp="2025-11-22 04:14:49 +0000 UTC" 
firstStartedPulling="2025-11-22 04:14:51.128883196 +0000 UTC m=+4855.434900489" lastFinishedPulling="2025-11-22 04:14:56.238168912 +0000 UTC m=+4860.544186225" observedRunningTime="2025-11-22 04:14:57.243588169 +0000 UTC m=+4861.549605472" watchObservedRunningTime="2025-11-22 04:14:57.260054376 +0000 UTC m=+4861.566071689" Nov 22 04:14:59 crc kubenswrapper[4952]: I1122 04:14:59.948432 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:14:59 crc kubenswrapper[4952]: I1122 04:14:59.949008 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.148749 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn"] Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.150238 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.154913 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.158830 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.159109 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn"] Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.287871 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdxb\" (UniqueName: \"kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.288221 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.288363 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.390097 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdxb\" (UniqueName: \"kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.390162 4952 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.390279 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.391144 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.397632 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.429003 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdxb\" (UniqueName: \"kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb\") pod \"collect-profiles-29396415-dcvpn\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:00 crc kubenswrapper[4952]: I1122 04:15:00.490995 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:01 crc kubenswrapper[4952]: I1122 04:15:01.008586 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn"] Nov 22 04:15:01 crc kubenswrapper[4952]: I1122 04:15:01.255027 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" event={"ID":"64ed9f9c-80c9-4553-aeef-d4fb16a11760","Type":"ContainerStarted","Data":"9ff7b87ea0b636e68ce28a1e5a8e668db288a8d92bb7d9fbc3bb966e3db52b77"} Nov 22 04:15:01 crc kubenswrapper[4952]: I1122 04:15:01.606859 4952 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mfkrp" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="registry-server" probeResult="failure" output=< Nov 22 04:15:01 crc kubenswrapper[4952]: timeout: failed to connect service ":50051" within 1s Nov 22 04:15:01 crc kubenswrapper[4952]: > Nov 22 04:15:04 crc kubenswrapper[4952]: I1122 04:15:04.301098 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" event={"ID":"64ed9f9c-80c9-4553-aeef-d4fb16a11760","Type":"ContainerStarted","Data":"519782cb1374c36aa4c6cc4afce414c07612904a00d664b7296f88de500894b0"} Nov 22 04:15:05 crc kubenswrapper[4952]: I1122 04:15:05.313417 4952 generic.go:334] "Generic (PLEG): container finished" podID="64ed9f9c-80c9-4553-aeef-d4fb16a11760" containerID="519782cb1374c36aa4c6cc4afce414c07612904a00d664b7296f88de500894b0" exitCode=0 Nov 22 04:15:05 crc kubenswrapper[4952]: I1122 04:15:05.313506 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" event={"ID":"64ed9f9c-80c9-4553-aeef-d4fb16a11760","Type":"ContainerDied","Data":"519782cb1374c36aa4c6cc4afce414c07612904a00d664b7296f88de500894b0"} Nov 22 04:15:06 crc kubenswrapper[4952]: I1122 04:15:06.330463 4952 generic.go:334] "Generic (PLEG): container finished" podID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerID="9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22" exitCode=0 Nov 22 04:15:06 crc kubenswrapper[4952]: I1122 04:15:06.330586 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerDied","Data":"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22"} Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.343797 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" event={"ID":"64ed9f9c-80c9-4553-aeef-d4fb16a11760","Type":"ContainerDied","Data":"9ff7b87ea0b636e68ce28a1e5a8e668db288a8d92bb7d9fbc3bb966e3db52b77"} Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.344207 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff7b87ea0b636e68ce28a1e5a8e668db288a8d92bb7d9fbc3bb966e3db52b77" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.439675 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.552520 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume\") pod \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.552616 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwdxb\" (UniqueName: \"kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb\") pod \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.552713 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume\") pod \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\" (UID: \"64ed9f9c-80c9-4553-aeef-d4fb16a11760\") " Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.553939 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume" (OuterVolumeSpecName: "config-volume") pod "64ed9f9c-80c9-4553-aeef-d4fb16a11760" (UID: "64ed9f9c-80c9-4553-aeef-d4fb16a11760"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.563846 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "64ed9f9c-80c9-4553-aeef-d4fb16a11760" (UID: "64ed9f9c-80c9-4553-aeef-d4fb16a11760"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.563875 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb" (OuterVolumeSpecName: "kube-api-access-lwdxb") pod "64ed9f9c-80c9-4553-aeef-d4fb16a11760" (UID: "64ed9f9c-80c9-4553-aeef-d4fb16a11760"). InnerVolumeSpecName "kube-api-access-lwdxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.656037 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64ed9f9c-80c9-4553-aeef-d4fb16a11760-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.656089 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwdxb\" (UniqueName: \"kubernetes.io/projected/64ed9f9c-80c9-4553-aeef-d4fb16a11760-kube-api-access-lwdxb\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:07 crc kubenswrapper[4952]: I1122 04:15:07.656110 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64ed9f9c-80c9-4553-aeef-d4fb16a11760-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:08 crc kubenswrapper[4952]: I1122 04:15:08.354867 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-dcvpn" Nov 22 04:15:08 crc kubenswrapper[4952]: I1122 04:15:08.544080 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k"] Nov 22 04:15:08 crc kubenswrapper[4952]: I1122 04:15:08.551107 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-pvz2k"] Nov 22 04:15:09 crc kubenswrapper[4952]: I1122 04:15:09.992372 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:15:10 crc kubenswrapper[4952]: I1122 04:15:10.040377 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:15:10 crc kubenswrapper[4952]: I1122 04:15:10.236446 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:15:10 crc kubenswrapper[4952]: I1122 04:15:10.548155 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9f9c23-32be-47e9-85fa-91ed3572291e" path="/var/lib/kubelet/pods/bf9f9c23-32be-47e9-85fa-91ed3572291e/volumes" Nov 22 04:15:11 crc kubenswrapper[4952]: I1122 04:15:11.387531 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mfkrp" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="registry-server" containerID="cri-o://c34f5760dd44b8d0c3f0b6d240e6dbdc3a1837c09067989158f4cb280585a9f4" gracePeriod=2 Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.410639 4952 generic.go:334] "Generic (PLEG): container finished" podID="20364d86-f630-4bc4-86e3-220d711e01bf" containerID="c34f5760dd44b8d0c3f0b6d240e6dbdc3a1837c09067989158f4cb280585a9f4" exitCode=0 Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.410765 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerDied","Data":"c34f5760dd44b8d0c3f0b6d240e6dbdc3a1837c09067989158f4cb280585a9f4"} Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.414925 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerStarted","Data":"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc"} Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.442290 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l8hrq" podStartSLOduration=3.677147559 podStartE2EDuration="22.442267313s" podCreationTimestamp="2025-11-22 04:14:50 +0000 UTC" firstStartedPulling="2025-11-22 04:14:53.284587331 +0000 UTC m=+4857.590604604" lastFinishedPulling="2025-11-22 04:15:12.049707055 +0000 UTC m=+4876.355724358" observedRunningTime="2025-11-22 04:15:12.438227236 +0000 UTC m=+4876.744244529" watchObservedRunningTime="2025-11-22 04:15:12.442267313 +0000 UTC m=+4876.748284606" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.752874 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.883916 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn5mp\" (UniqueName: \"kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp\") pod \"20364d86-f630-4bc4-86e3-220d711e01bf\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.884291 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content\") pod \"20364d86-f630-4bc4-86e3-220d711e01bf\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.884337 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities\") pod \"20364d86-f630-4bc4-86e3-220d711e01bf\" (UID: \"20364d86-f630-4bc4-86e3-220d711e01bf\") " Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.886850 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities" (OuterVolumeSpecName: "utilities") pod "20364d86-f630-4bc4-86e3-220d711e01bf" (UID: "20364d86-f630-4bc4-86e3-220d711e01bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.903328 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp" (OuterVolumeSpecName: "kube-api-access-bn5mp") pod "20364d86-f630-4bc4-86e3-220d711e01bf" (UID: "20364d86-f630-4bc4-86e3-220d711e01bf"). InnerVolumeSpecName "kube-api-access-bn5mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.987347 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.987388 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn5mp\" (UniqueName: \"kubernetes.io/projected/20364d86-f630-4bc4-86e3-220d711e01bf-kube-api-access-bn5mp\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:12 crc kubenswrapper[4952]: I1122 04:15:12.989256 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20364d86-f630-4bc4-86e3-220d711e01bf" (UID: "20364d86-f630-4bc4-86e3-220d711e01bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.090121 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20364d86-f630-4bc4-86e3-220d711e01bf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.431069 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mfkrp" event={"ID":"20364d86-f630-4bc4-86e3-220d711e01bf","Type":"ContainerDied","Data":"c1e97359f15d421b7e2993483e8dd70ca349c81ce15dbb2350e5ce45113df019"} Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.431172 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mfkrp" Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.431355 4952 scope.go:117] "RemoveContainer" containerID="c34f5760dd44b8d0c3f0b6d240e6dbdc3a1837c09067989158f4cb280585a9f4" Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.480574 4952 scope.go:117] "RemoveContainer" containerID="8219eb348be763327d2985a543dd4003ffc37b6ec9769fa2a3ff8270fb718eb7" Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.480798 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:15:13 crc kubenswrapper[4952]: I1122 04:15:13.489285 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mfkrp"] Nov 22 04:15:14 crc kubenswrapper[4952]: I1122 04:15:14.045816 4952 scope.go:117] "RemoveContainer" containerID="a0ac5e473eabd235f3e78ea895aafbfbe889a30d9c70b6b1381eec05268b677d" Nov 22 04:15:14 crc kubenswrapper[4952]: I1122 04:15:14.545352 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" path="/var/lib/kubelet/pods/20364d86-f630-4bc4-86e3-220d711e01bf/volumes" Nov 22 04:15:21 crc kubenswrapper[4952]: I1122 04:15:21.353697 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:21 crc kubenswrapper[4952]: I1122 04:15:21.355119 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:21 crc kubenswrapper[4952]: I1122 04:15:21.426154 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:21 crc kubenswrapper[4952]: I1122 04:15:21.577055 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:22 crc kubenswrapper[4952]: I1122 04:15:22.182901 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:15:23 crc kubenswrapper[4952]: I1122 04:15:23.543161 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l8hrq" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="registry-server" containerID="cri-o://da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc" gracePeriod=2 Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.094331 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.241432 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8ng\" (UniqueName: \"kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng\") pod \"0be8daeb-e687-43ba-a52d-33a21f12db04\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.241495 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities\") pod \"0be8daeb-e687-43ba-a52d-33a21f12db04\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.241689 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content\") pod \"0be8daeb-e687-43ba-a52d-33a21f12db04\" (UID: \"0be8daeb-e687-43ba-a52d-33a21f12db04\") " Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.242691 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities" (OuterVolumeSpecName: "utilities") pod "0be8daeb-e687-43ba-a52d-33a21f12db04" (UID: "0be8daeb-e687-43ba-a52d-33a21f12db04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.247767 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng" (OuterVolumeSpecName: "kube-api-access-vw8ng") pod "0be8daeb-e687-43ba-a52d-33a21f12db04" (UID: "0be8daeb-e687-43ba-a52d-33a21f12db04"). InnerVolumeSpecName "kube-api-access-vw8ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.304359 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0be8daeb-e687-43ba-a52d-33a21f12db04" (UID: "0be8daeb-e687-43ba-a52d-33a21f12db04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.343986 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw8ng\" (UniqueName: \"kubernetes.io/projected/0be8daeb-e687-43ba-a52d-33a21f12db04-kube-api-access-vw8ng\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.344019 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.344030 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be8daeb-e687-43ba-a52d-33a21f12db04-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.557024 4952 generic.go:334] "Generic (PLEG): container finished" podID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerID="da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc" exitCode=0 Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.557079 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerDied","Data":"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc"} Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.557113 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8hrq" event={"ID":"0be8daeb-e687-43ba-a52d-33a21f12db04","Type":"ContainerDied","Data":"323bff33b494256c950620a8ea5a4a52844b6c8facc67b466238ac9f462a38d3"} Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.557111 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l8hrq" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.557130 4952 scope.go:117] "RemoveContainer" containerID="da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.602152 4952 scope.go:117] "RemoveContainer" containerID="9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.609149 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.616074 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l8hrq"] Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.638969 4952 scope.go:117] "RemoveContainer" containerID="19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.684109 4952 scope.go:117] "RemoveContainer" containerID="da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc" Nov 22 04:15:24 crc kubenswrapper[4952]: E1122 04:15:24.686196 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc\": container with ID starting with da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc not found: ID does not exist" containerID="da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.686249 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc"} err="failed to get container status \"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc\": rpc error: code = NotFound desc = could not find container \"da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc\": container with ID starting with da0de953cb61389ef4cd8680420c392ea8fd666ab0a74d50c0c8618d0ee7eadc not found: ID does not exist" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.686281 4952 scope.go:117] "RemoveContainer" containerID="9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22" Nov 22 04:15:24 crc kubenswrapper[4952]: E1122 04:15:24.686943 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22\": container with ID starting with 9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22 not found: ID does not exist" containerID="9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.686992 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22"} err="failed to get container status \"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22\": rpc error: code = NotFound desc = could not find container \"9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22\": container with ID starting with 9032d986a07fb68c423bbec9b67f7e48734a87df88dcac18154f2011c84ebd22 not found: ID does not exist" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.687020 4952 scope.go:117] "RemoveContainer" 
containerID="19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6" Nov 22 04:15:24 crc kubenswrapper[4952]: E1122 04:15:24.687480 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6\": container with ID starting with 19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6 not found: ID does not exist" containerID="19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6" Nov 22 04:15:24 crc kubenswrapper[4952]: I1122 04:15:24.687534 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6"} err="failed to get container status \"19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6\": rpc error: code = NotFound desc = could not find container \"19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6\": container with ID starting with 19d2ec7441e402ba2947a7228b36b1377573494bc2ce296733a00f0f3446a5b6 not found: ID does not exist" Nov 22 04:15:26 crc kubenswrapper[4952]: I1122 04:15:26.544192 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" path="/var/lib/kubelet/pods/0be8daeb-e687-43ba-a52d-33a21f12db04/volumes" Nov 22 04:15:28 crc kubenswrapper[4952]: I1122 04:15:28.342088 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:15:28 crc kubenswrapper[4952]: I1122 04:15:28.342170 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:15:58 crc kubenswrapper[4952]: I1122 04:15:58.341758 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:15:58 crc kubenswrapper[4952]: I1122 04:15:58.342272 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:16:04 crc kubenswrapper[4952]: I1122 04:16:04.177987 4952 scope.go:117] "RemoveContainer" containerID="91f211b25270cea94909e052ea7f67f868614318f5dbbffa3ddf0ec178eec2e6" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.681283 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682087 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="extract-content" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682100 4952 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="extract-content" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682112 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="extract-utilities" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682118 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="extract-utilities" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682138 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682144 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682159 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="extract-content" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682165 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="extract-content" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682177 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682182 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682190 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ed9f9c-80c9-4553-aeef-d4fb16a11760" containerName="collect-profiles" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682195 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ed9f9c-80c9-4553-aeef-d4fb16a11760" containerName="collect-profiles" Nov 22 04:16:08 crc kubenswrapper[4952]: E1122 04:16:08.682205 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="extract-utilities" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682212 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="extract-utilities" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682405 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be8daeb-e687-43ba-a52d-33a21f12db04" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682422 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ed9f9c-80c9-4553-aeef-d4fb16a11760" containerName="collect-profiles" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.682434 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="20364d86-f630-4bc4-86e3-220d711e01bf" containerName="registry-server" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.683719 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.697044 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.810152 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2bj\" (UniqueName: \"kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.810519 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.810752 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.912750 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2bj\" (UniqueName: \"kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.912828 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.912890 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.913443 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.914188 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:08 crc kubenswrapper[4952]: I1122 04:16:08.942478 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vh2bj\" (UniqueName: \"kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj\") pod \"community-operators-ctvrd\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:09 crc kubenswrapper[4952]: I1122 04:16:09.019431 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:09 crc kubenswrapper[4952]: I1122 04:16:09.577170 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:10 crc kubenswrapper[4952]: I1122 04:16:10.027856 4952 generic.go:334] "Generic (PLEG): container finished" podID="2803f723-3676-4783-a98f-eac6228ef06b" containerID="c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea" exitCode=0 Nov 22 04:16:10 crc kubenswrapper[4952]: I1122 04:16:10.028004 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerDied","Data":"c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea"} Nov 22 04:16:10 crc kubenswrapper[4952]: I1122 04:16:10.028158 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerStarted","Data":"0723c166a5abcb45c4d16357a5e52cc67871f38c5e3517cb019ac45960d770b0"} Nov 22 04:16:12 crc kubenswrapper[4952]: I1122 04:16:12.055611 4952 generic.go:334] "Generic (PLEG): container finished" podID="2803f723-3676-4783-a98f-eac6228ef06b" containerID="b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc" exitCode=0 Nov 22 04:16:12 crc kubenswrapper[4952]: I1122 04:16:12.055837 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerDied","Data":"b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc"} Nov 22 04:16:14 crc kubenswrapper[4952]: I1122 04:16:14.076905 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerStarted","Data":"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d"} Nov 22 04:16:14 crc kubenswrapper[4952]: I1122 04:16:14.107774 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ctvrd" podStartSLOduration=3.674841217 podStartE2EDuration="6.107753122s" podCreationTimestamp="2025-11-22 04:16:08 +0000 UTC" firstStartedPulling="2025-11-22 04:16:10.03350675 +0000 UTC m=+4934.339524023" lastFinishedPulling="2025-11-22 04:16:12.466418615 +0000 UTC m=+4936.772435928" observedRunningTime="2025-11-22 04:16:14.101604729 +0000 UTC m=+4938.407622012" watchObservedRunningTime="2025-11-22 04:16:14.107753122 +0000 UTC m=+4938.413770405" Nov 22 04:16:19 crc kubenswrapper[4952]: I1122 04:16:19.020432 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:19 crc kubenswrapper[4952]: I1122 04:16:19.021059 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:19 crc kubenswrapper[4952]: I1122 04:16:19.079843 4952 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:19 crc kubenswrapper[4952]: I1122 04:16:19.173534 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:19 crc kubenswrapper[4952]: I1122 04:16:19.323176 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.140192 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ctvrd" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="registry-server" containerID="cri-o://80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d" gracePeriod=2 Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.757634 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.796590 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities\") pod \"2803f723-3676-4783-a98f-eac6228ef06b\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.796756 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2bj\" (UniqueName: \"kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj\") pod \"2803f723-3676-4783-a98f-eac6228ef06b\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.796803 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content\") pod \"2803f723-3676-4783-a98f-eac6228ef06b\" (UID: \"2803f723-3676-4783-a98f-eac6228ef06b\") " Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.797904 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities" (OuterVolumeSpecName: "utilities") pod "2803f723-3676-4783-a98f-eac6228ef06b" (UID: "2803f723-3676-4783-a98f-eac6228ef06b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.804648 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj" (OuterVolumeSpecName: "kube-api-access-vh2bj") pod "2803f723-3676-4783-a98f-eac6228ef06b" (UID: "2803f723-3676-4783-a98f-eac6228ef06b"). InnerVolumeSpecName "kube-api-access-vh2bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.866072 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2803f723-3676-4783-a98f-eac6228ef06b" (UID: "2803f723-3676-4783-a98f-eac6228ef06b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.899280 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.899326 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2bj\" (UniqueName: \"kubernetes.io/projected/2803f723-3676-4783-a98f-eac6228ef06b-kube-api-access-vh2bj\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:21 crc kubenswrapper[4952]: I1122 04:16:21.899343 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2803f723-3676-4783-a98f-eac6228ef06b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.153645 4952 generic.go:334] "Generic (PLEG): container finished" podID="2803f723-3676-4783-a98f-eac6228ef06b" containerID="80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d" exitCode=0 Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.153716 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerDied","Data":"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d"} Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.153758 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctvrd" event={"ID":"2803f723-3676-4783-a98f-eac6228ef06b","Type":"ContainerDied","Data":"0723c166a5abcb45c4d16357a5e52cc67871f38c5e3517cb019ac45960d770b0"} Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.153801 4952 scope.go:117] "RemoveContainer" containerID="80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.154020 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctvrd" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.192098 4952 scope.go:117] "RemoveContainer" containerID="b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.210689 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.217921 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ctvrd"] Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.228754 4952 scope.go:117] "RemoveContainer" containerID="c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.279916 4952 scope.go:117] "RemoveContainer" containerID="80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d" Nov 22 04:16:22 crc kubenswrapper[4952]: E1122 04:16:22.280471 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d\": container with ID starting with 80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d not found: ID does not exist" containerID="80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.280512 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d"} err="failed to get container status \"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d\": rpc error: code = NotFound desc = could not find container \"80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d\": container with ID starting with 80ff016d70ac241aa24d5c9961331d9e87f3152a48e53e4f0e7d63d7716f809d not found: ID does not exist" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.280604 4952 scope.go:117] "RemoveContainer" containerID="b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc" Nov 22 04:16:22 crc kubenswrapper[4952]: E1122 04:16:22.281049 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc\": container with ID starting with b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc not found: ID does not exist" containerID="b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.281068 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc"} err="failed to get container status \"b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc\": rpc error: code = NotFound desc = could not find container \"b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc\": container with ID starting with b1166a91b148559c53eb901fdd4692c2320355d470738ca4aef46d1cafd9d2fc not found: ID does not exist" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.281083 4952 scope.go:117] "RemoveContainer" containerID="c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea" Nov 22 04:16:22 crc kubenswrapper[4952]: E1122 04:16:22.281431 4952 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea\": container with ID starting with c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea not found: ID does not exist" containerID="c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.281454 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea"} err="failed to get container status \"c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea\": rpc error: code = NotFound desc = could not find container \"c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea\": container with ID starting with c353453d9b914c39f6c64a546c39e0328e1420d82d5881e10c1251975543b1ea not found: ID does not exist" Nov 22 04:16:22 crc kubenswrapper[4952]: I1122 04:16:22.547396 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2803f723-3676-4783-a98f-eac6228ef06b" path="/var/lib/kubelet/pods/2803f723-3676-4783-a98f-eac6228ef06b/volumes" Nov 22 04:16:28 crc kubenswrapper[4952]: I1122 04:16:28.342299 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:16:28 crc kubenswrapper[4952]: I1122 04:16:28.342826 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:16:28 crc kubenswrapper[4952]: I1122 04:16:28.342877 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 04:16:28 crc kubenswrapper[4952]: I1122 04:16:28.343683 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:16:28 crc kubenswrapper[4952]: I1122 04:16:28.343750 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" gracePeriod=600 Nov 22 04:16:28 crc kubenswrapper[4952]: E1122 04:16:28.514189 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:16:29 crc kubenswrapper[4952]: I1122 04:16:29.234291 4952 generic.go:334] 
"Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" exitCode=0 Nov 22 04:16:29 crc kubenswrapper[4952]: I1122 04:16:29.234336 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"} Nov 22 04:16:29 crc kubenswrapper[4952]: I1122 04:16:29.234373 4952 scope.go:117] "RemoveContainer" containerID="5990ed4acfae91ca029ca1ac3b1ebe6496d1dc57eebe1b3f78308af3fb814783" Nov 22 04:16:29 crc kubenswrapper[4952]: I1122 04:16:29.235229 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:16:29 crc kubenswrapper[4952]: E1122 04:16:29.235695 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:16:43 crc kubenswrapper[4952]: I1122 04:16:43.532202 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:16:43 crc kubenswrapper[4952]: E1122 04:16:43.533110 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:16:58 crc kubenswrapper[4952]: I1122 04:16:58.531631 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:16:58 crc kubenswrapper[4952]: E1122 04:16:58.532669 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:17:13 crc kubenswrapper[4952]: I1122 04:17:13.531870 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:17:13 crc kubenswrapper[4952]: E1122 04:17:13.533014 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:17:26 crc kubenswrapper[4952]: I1122 04:17:26.548863 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" 
Nov 22 04:17:26 crc kubenswrapper[4952]: E1122 04:17:26.549692 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:17:39 crc kubenswrapper[4952]: I1122 04:17:39.532240 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:17:39 crc kubenswrapper[4952]: E1122 04:17:39.533264 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:17:53 crc kubenswrapper[4952]: I1122 04:17:53.530976 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:17:53 crc kubenswrapper[4952]: E1122 04:17:53.531678 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:18:04 crc kubenswrapper[4952]: I1122 04:18:04.531253 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:18:04 crc kubenswrapper[4952]: E1122 04:18:04.532676 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:18:09 crc kubenswrapper[4952]: I1122 04:18:09.728910 4952 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="9a3eb772-8262-4b28-873f-63f00885054d" containerName="galera" probeResult="failure" output="command timed out"
Nov 22 04:18:09 crc kubenswrapper[4952]: I1122 04:18:09.729735 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="9a3eb772-8262-4b28-873f-63f00885054d" containerName="galera" probeResult="failure" output="command timed out"
Nov 22 04:18:15 crc kubenswrapper[4952]: I1122 04:18:15.531675 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:18:15 crc kubenswrapper[4952]: E1122 04:18:15.532410 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:18:26 crc kubenswrapper[4952]: I1122 04:18:26.540167 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:18:26 crc kubenswrapper[4952]: E1122 04:18:26.541087 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:18:40 crc kubenswrapper[4952]: I1122 04:18:40.531799 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:18:40 crc kubenswrapper[4952]: E1122 04:18:40.532583 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:18:55 crc kubenswrapper[4952]: I1122 04:18:55.530899 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:18:55 crc kubenswrapper[4952]: E1122 04:18:55.533139 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:19:07 crc kubenswrapper[4952]: I1122 04:19:07.532389 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:19:07 crc kubenswrapper[4952]: E1122 04:19:07.533373 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:19:20 crc kubenswrapper[4952]: I1122 04:19:20.531862 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:19:20 crc kubenswrapper[4952]: E1122 04:19:20.532732 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:19:33 crc kubenswrapper[4952]: I1122 04:19:33.531202 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:19:33 crc kubenswrapper[4952]: E1122 04:19:33.531966 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:19:46 crc kubenswrapper[4952]: I1122 04:19:46.546349 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:19:46 crc kubenswrapper[4952]: E1122 04:19:46.547280 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:20:00 crc kubenswrapper[4952]: I1122 04:20:00.531349 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:20:00 crc kubenswrapper[4952]: E1122 04:20:00.532167 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:20:13 crc kubenswrapper[4952]: I1122 04:20:13.531450 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:20:13 crc kubenswrapper[4952]: E1122 04:20:13.532406 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:20:24 crc kubenswrapper[4952]: I1122 04:20:24.531324 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05"
Nov 22 04:20:24 crc kubenswrapper[4952]: E1122 04:20:24.532174 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.206188 4952 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:25 crc kubenswrapper[4952]: E1122 04:20:25.206812 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="extract-content" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.206841 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="extract-content" Nov 22 04:20:25 crc kubenswrapper[4952]: E1122 04:20:25.206882 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="registry-server" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.206895 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="registry-server" Nov 22 04:20:25 crc kubenswrapper[4952]: E1122 04:20:25.206935 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="extract-utilities" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.206949 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="extract-utilities" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.207323 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="2803f723-3676-4783-a98f-eac6228ef06b" containerName="registry-server" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.209826 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.230101 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.273994 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.274149 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4tzs\" (UniqueName: \"kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.274447 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.434577 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.434675 4952 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.434924 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4tzs\" (UniqueName: \"kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.435161 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.435205 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.459527 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4tzs\" (UniqueName: \"kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs\") pod \"redhat-marketplace-xjkxx\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:25 crc kubenswrapper[4952]: I1122 04:20:25.533610 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:26 crc kubenswrapper[4952]: I1122 04:20:26.049357 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:26 crc kubenswrapper[4952]: I1122 04:20:26.689353 4952 generic.go:334] "Generic (PLEG): container finished" podID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerID="822dff6070c047ad0ec3230d88ed17b1d155ee3de1deb39ca837ef381a7c85c6" exitCode=0 Nov 22 04:20:26 crc kubenswrapper[4952]: I1122 04:20:26.689402 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerDied","Data":"822dff6070c047ad0ec3230d88ed17b1d155ee3de1deb39ca837ef381a7c85c6"} Nov 22 04:20:26 crc kubenswrapper[4952]: I1122 04:20:26.689435 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerStarted","Data":"cdcdca9ffe54d434246bcf7f1e081d3b0b25765d5875ff5e09e94b2e0e1b0d52"} Nov 22 04:20:26 crc kubenswrapper[4952]: I1122 04:20:26.692416 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:20:28 crc kubenswrapper[4952]: I1122 04:20:28.708353 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerStarted","Data":"a8f73844df760520ad1fca25334a760c60380d69cb1d4dea880109601a91139d"} Nov 22 04:20:29 crc kubenswrapper[4952]: I1122 04:20:29.719231 4952 generic.go:334] "Generic (PLEG): container finished" podID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerID="a8f73844df760520ad1fca25334a760c60380d69cb1d4dea880109601a91139d" exitCode=0 Nov 22 04:20:29 crc kubenswrapper[4952]: I1122 04:20:29.719293 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerDied","Data":"a8f73844df760520ad1fca25334a760c60380d69cb1d4dea880109601a91139d"} Nov 22 04:20:30 crc kubenswrapper[4952]: I1122 04:20:30.754678 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerStarted","Data":"77573c644c1c8b807123120bfadc4036b5ab6ada66c8ae7b2f8b9249a04818d6"} Nov 22 04:20:30 crc kubenswrapper[4952]: I1122 04:20:30.780589 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjkxx" podStartSLOduration=2.306840131 podStartE2EDuration="5.780572176s" podCreationTimestamp="2025-11-22 04:20:25 +0000 UTC" firstStartedPulling="2025-11-22 04:20:26.692165485 +0000 UTC m=+5190.998182758" lastFinishedPulling="2025-11-22 04:20:30.16589749 +0000 UTC m=+5194.471914803" observedRunningTime="2025-11-22 04:20:30.775967204 +0000 UTC m=+5195.081984497" watchObservedRunningTime="2025-11-22 04:20:30.780572176 +0000 UTC m=+5195.086589449" Nov 22 04:20:35 crc kubenswrapper[4952]: I1122 04:20:35.534527 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:35 crc kubenswrapper[4952]: I1122 04:20:35.535161 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 
04:20:35 crc kubenswrapper[4952]: I1122 04:20:35.592763 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:35 crc kubenswrapper[4952]: I1122 04:20:35.881885 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:35 crc kubenswrapper[4952]: I1122 04:20:35.931242 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:37 crc kubenswrapper[4952]: I1122 04:20:37.855267 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xjkxx" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="registry-server" containerID="cri-o://77573c644c1c8b807123120bfadc4036b5ab6ada66c8ae7b2f8b9249a04818d6" gracePeriod=2 Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.534817 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:20:38 crc kubenswrapper[4952]: E1122 04:20:38.535404 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.897917 4952 generic.go:334] "Generic (PLEG): container finished" podID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerID="77573c644c1c8b807123120bfadc4036b5ab6ada66c8ae7b2f8b9249a04818d6" exitCode=0 Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.898245 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerDied","Data":"77573c644c1c8b807123120bfadc4036b5ab6ada66c8ae7b2f8b9249a04818d6"} Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.898275 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkxx" event={"ID":"7c929ddf-cbd3-4b70-a8cd-6151088a2f30","Type":"ContainerDied","Data":"cdcdca9ffe54d434246bcf7f1e081d3b0b25765d5875ff5e09e94b2e0e1b0d52"} Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.898292 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdcdca9ffe54d434246bcf7f1e081d3b0b25765d5875ff5e09e94b2e0e1b0d52" Nov 22 04:20:38 crc kubenswrapper[4952]: I1122 04:20:38.969930 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.127868 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content\") pod \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.127981 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4tzs\" (UniqueName: \"kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs\") pod \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.128126 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities\") pod \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\" (UID: \"7c929ddf-cbd3-4b70-a8cd-6151088a2f30\") " Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.129422 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities" (OuterVolumeSpecName: "utilities") pod "7c929ddf-cbd3-4b70-a8cd-6151088a2f30" (UID: "7c929ddf-cbd3-4b70-a8cd-6151088a2f30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.138976 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs" (OuterVolumeSpecName: "kube-api-access-h4tzs") pod "7c929ddf-cbd3-4b70-a8cd-6151088a2f30" (UID: "7c929ddf-cbd3-4b70-a8cd-6151088a2f30"). InnerVolumeSpecName "kube-api-access-h4tzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.146039 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c929ddf-cbd3-4b70-a8cd-6151088a2f30" (UID: "7c929ddf-cbd3-4b70-a8cd-6151088a2f30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.231426 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.231471 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.231494 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4tzs\" (UniqueName: \"kubernetes.io/projected/7c929ddf-cbd3-4b70-a8cd-6151088a2f30-kube-api-access-h4tzs\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.907597 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkxx" Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.963431 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:39 crc kubenswrapper[4952]: I1122 04:20:39.972864 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkxx"] Nov 22 04:20:40 crc kubenswrapper[4952]: I1122 04:20:40.544856 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" path="/var/lib/kubelet/pods/7c929ddf-cbd3-4b70-a8cd-6151088a2f30/volumes" Nov 22 04:20:49 crc kubenswrapper[4952]: I1122 04:20:49.531288 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:20:49 crc kubenswrapper[4952]: E1122 04:20:49.532280 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:21:00 crc kubenswrapper[4952]: I1122 04:21:00.531323 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:21:00 crc kubenswrapper[4952]: E1122 04:21:00.532269 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:21:11 crc kubenswrapper[4952]: I1122 04:21:11.532483 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:21:11 crc kubenswrapper[4952]: E1122 04:21:11.533749 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:21:22 crc kubenswrapper[4952]: I1122 04:21:22.531992 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:21:22 crc kubenswrapper[4952]: E1122 04:21:22.533218 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:21:35 crc kubenswrapper[4952]: I1122 04:21:35.532248 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:21:36 
crc kubenswrapper[4952]: I1122 04:21:36.495283 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff"} Nov 22 04:23:56 crc kubenswrapper[4952]: I1122 04:23:56.911626 4952 generic.go:334] "Generic (PLEG): container finished" podID="b32e0459-7aee-4841-8281-da334fe3e8d8" containerID="88b54f14f1febd929130bb4eeb738873694fb90cdd27e9698f0b9b0fe899a3bd" exitCode=1 Nov 22 04:23:56 crc kubenswrapper[4952]: I1122 04:23:56.911642 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b32e0459-7aee-4841-8281-da334fe3e8d8","Type":"ContainerDied","Data":"88b54f14f1febd929130bb4eeb738873694fb90cdd27e9698f0b9b0fe899a3bd"} Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.342487 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.342938 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.756736 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.825903 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjpx4\" (UniqueName: \"kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826038 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826157 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826203 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826280 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: 
\"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826402 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826452 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826493 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.826563 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary\") pod \"b32e0459-7aee-4841-8281-da334fe3e8d8\" (UID: \"b32e0459-7aee-4841-8281-da334fe3e8d8\") " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.827319 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.827771 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data" (OuterVolumeSpecName: "config-data") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.832832 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.841927 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.845172 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4" (OuterVolumeSpecName: "kube-api-access-kjpx4") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "kube-api-access-kjpx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.863099 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.869688 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.872086 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.895156 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b32e0459-7aee-4841-8281-da334fe3e8d8" (UID: "b32e0459-7aee-4841-8281-da334fe3e8d8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928506 4952 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928786 4952 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928811 4952 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928821 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928834 4952 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928867 4952 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928876 4952 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b32e0459-7aee-4841-8281-da334fe3e8d8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928885 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjpx4\" (UniqueName: \"kubernetes.io/projected/b32e0459-7aee-4841-8281-da334fe3e8d8-kube-api-access-kjpx4\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.928895 4952 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b32e0459-7aee-4841-8281-da334fe3e8d8-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.945413 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b32e0459-7aee-4841-8281-da334fe3e8d8","Type":"ContainerDied","Data":"57c42742ad191227ab4ae8d29b85a2a7068f2524ff28dcf2a47fe6e7988a6809"} Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.945454 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c42742ad191227ab4ae8d29b85a2a7068f2524ff28dcf2a47fe6e7988a6809" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.945513 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 04:23:58 crc kubenswrapper[4952]: I1122 04:23:58.951313 4952 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 22 04:23:59 crc kubenswrapper[4952]: I1122 04:23:59.030810 4952 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.580870 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:24:01 crc kubenswrapper[4952]: E1122 04:24:01.582562 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="extract-content" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.582583 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="extract-content" Nov 22 04:24:01 crc kubenswrapper[4952]: E1122 04:24:01.582624 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32e0459-7aee-4841-8281-da334fe3e8d8" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.582636 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32e0459-7aee-4841-8281-da334fe3e8d8" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:24:01 crc kubenswrapper[4952]: E1122 04:24:01.582654 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="registry-server" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.582668 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="registry-server" Nov 22 04:24:01 crc kubenswrapper[4952]: E1122 04:24:01.582682 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="extract-utilities" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.582691 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="extract-utilities" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.583009 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32e0459-7aee-4841-8281-da334fe3e8d8" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.583056 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c929ddf-cbd3-4b70-a8cd-6151088a2f30" containerName="registry-server" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.584292 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.591775 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z7k5c" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.602100 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.681471 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.681581 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ng8z\" (UniqueName: \"kubernetes.io/projected/e4eb377f-8d5f-44e6-b719-2374703359e3-kube-api-access-7ng8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.783538 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.783699 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ng8z\" (UniqueName: \"kubernetes.io/projected/e4eb377f-8d5f-44e6-b719-2374703359e3-kube-api-access-7ng8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.784273 4952 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.820677 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ng8z\" (UniqueName: \"kubernetes.io/projected/e4eb377f-8d5f-44e6-b719-2374703359e3-kube-api-access-7ng8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc kubenswrapper[4952]: I1122 04:24:01.833507 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4eb377f-8d5f-44e6-b719-2374703359e3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:01 crc 
kubenswrapper[4952]: I1122 04:24:01.917126 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:24:02 crc kubenswrapper[4952]: I1122 04:24:02.481137 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:24:02 crc kubenswrapper[4952]: I1122 04:24:02.978984 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4eb377f-8d5f-44e6-b719-2374703359e3","Type":"ContainerStarted","Data":"b8cd42b12a101ba9c41938fd8a81d5952f64aa81bb0a1562fb1832bcac36f119"} Nov 22 04:24:05 crc kubenswrapper[4952]: I1122 04:24:04.999880 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4eb377f-8d5f-44e6-b719-2374703359e3","Type":"ContainerStarted","Data":"82e2af84d8702b2f1426cb3ccd1d534f3b08bbff498449f52f11264de75b3475"} Nov 22 04:24:05 crc kubenswrapper[4952]: I1122 04:24:05.020953 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.565043562 podStartE2EDuration="4.020930223s" podCreationTimestamp="2025-11-22 04:24:01 +0000 UTC" firstStartedPulling="2025-11-22 04:24:02.513745628 +0000 UTC m=+5406.819762941" lastFinishedPulling="2025-11-22 04:24:03.969632319 +0000 UTC m=+5408.275649602" observedRunningTime="2025-11-22 04:24:05.018856138 +0000 UTC m=+5409.324873481" watchObservedRunningTime="2025-11-22 04:24:05.020930223 +0000 UTC m=+5409.326947526" Nov 22 04:24:28 crc kubenswrapper[4952]: I1122 04:24:28.341793 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:28 crc kubenswrapper[4952]: I1122 04:24:28.342354 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.239855 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjcqx/must-gather-pt4g2"] Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.242269 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.244243 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jjcqx"/"default-dockercfg-vcss7" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.245169 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jjcqx"/"openshift-service-ca.crt" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.245313 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jjcqx"/"kube-root-ca.crt" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.259578 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjcqx/must-gather-pt4g2"] Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.275130 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.275312 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcq9\" (UniqueName: \"kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.377049 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcq9\" (UniqueName: \"kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.377171 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.377654 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.401712 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcq9\" (UniqueName: \"kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9\") pod \"must-gather-pt4g2\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") " pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:36 crc kubenswrapper[4952]: I1122 04:24:36.572971 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" Nov 22 04:24:37 crc kubenswrapper[4952]: I1122 04:24:37.048382 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjcqx/must-gather-pt4g2"] Nov 22 04:24:37 crc kubenswrapper[4952]: W1122 04:24:37.062879 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod726bf479_478d_4230_965e_78041e86ad1f.slice/crio-cdd27b8b63334d115556d27135c09c89768aa808161d2984b4e1bb7804c4595e WatchSource:0}: Error finding container cdd27b8b63334d115556d27135c09c89768aa808161d2984b4e1bb7804c4595e: Status 404 returned error can't find the container with id cdd27b8b63334d115556d27135c09c89768aa808161d2984b4e1bb7804c4595e Nov 22 04:24:37 crc kubenswrapper[4952]: I1122 04:24:37.368394 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" event={"ID":"726bf479-478d-4230-965e-78041e86ad1f","Type":"ContainerStarted","Data":"cdd27b8b63334d115556d27135c09c89768aa808161d2984b4e1bb7804c4595e"} Nov 22 04:24:44 crc kubenswrapper[4952]: I1122 04:24:44.440877 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" event={"ID":"726bf479-478d-4230-965e-78041e86ad1f","Type":"ContainerStarted","Data":"1a864d4a9881e624d7b6df9a55388c1d8f0f11e4f03f12f7ecda308b03fde033"} Nov 22 04:24:44 crc kubenswrapper[4952]: I1122 04:24:44.441373 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" event={"ID":"726bf479-478d-4230-965e-78041e86ad1f","Type":"ContainerStarted","Data":"6d65902ba02bef3280d837f104c790f7a1e8c1a43bf7bf0ae55fb3d54ffab97a"} Nov 22 04:24:44 crc kubenswrapper[4952]: I1122 04:24:44.471937 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" podStartSLOduration=1.779933712 podStartE2EDuration="8.471919625s" podCreationTimestamp="2025-11-22 04:24:36 +0000 UTC" firstStartedPulling="2025-11-22 04:24:37.06883573 +0000 UTC m=+5441.374853013" lastFinishedPulling="2025-11-22 04:24:43.760821613 +0000 UTC m=+5448.066838926" observedRunningTime="2025-11-22 04:24:44.463183333 +0000 UTC m=+5448.769200626" watchObservedRunningTime="2025-11-22 04:24:44.471919625 +0000 UTC m=+5448.777936898" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.443192 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-cl79k"] Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.445083 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.542315 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.542877 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxzv\" (UniqueName: \"kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.644750 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.644933 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.645203 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxzv\" (UniqueName: \"kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.667506 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxzv\" (UniqueName: \"kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv\") pod \"crc-debug-cl79k\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:53 crc kubenswrapper[4952]: I1122 04:24:53.764572 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:24:54 crc kubenswrapper[4952]: I1122 04:24:54.543246 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" event={"ID":"3ed392a8-7ded-47dc-b2f4-e21e3bed8769","Type":"ContainerStarted","Data":"a09cef14f3f99ad98ac3422a98dd79be253aff9e1dbf6c2c7ca6e22f88b11de7"} Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.342125 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.342800 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.342860 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.343459 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.343516 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff" gracePeriod=600 Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.570107 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff" exitCode=0 Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.570263 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff"} Nov 22 04:24:58 crc kubenswrapper[4952]: I1122 04:24:58.570420 4952 scope.go:117] "RemoveContainer" containerID="20174f56f7add2893b18c524a222c45520930e21438c3e0da8c9dba4680efc05" Nov 22 04:25:05 crc kubenswrapper[4952]: I1122 04:25:05.668303 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"} Nov 22 04:25:05 crc kubenswrapper[4952]: I1122 04:25:05.672367 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" 
event={"ID":"3ed392a8-7ded-47dc-b2f4-e21e3bed8769","Type":"ContainerStarted","Data":"430287ee93ec7f5539503a592fe0449f32e3847b09aefd7c08fdfc9e7db94266"} Nov 22 04:25:05 crc kubenswrapper[4952]: I1122 04:25:05.714992 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" podStartSLOduration=2.106177329 podStartE2EDuration="12.714969106s" podCreationTimestamp="2025-11-22 04:24:53 +0000 UTC" firstStartedPulling="2025-11-22 04:24:53.822005148 +0000 UTC m=+5458.128022411" lastFinishedPulling="2025-11-22 04:25:04.430796875 +0000 UTC m=+5468.736814188" observedRunningTime="2025-11-22 04:25:05.708618317 +0000 UTC m=+5470.014635590" watchObservedRunningTime="2025-11-22 04:25:05.714969106 +0000 UTC m=+5470.020986389" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.259479 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.261892 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.281368 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.385959 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.386231 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j96j\" (UniqueName: \"kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.386271 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.488283 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.488362 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j96j\" (UniqueName: \"kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.488414 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.489202 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.489301 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.510781 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j96j\" (UniqueName: \"kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j\") pod \"redhat-operators-fgp2d\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:23 crc kubenswrapper[4952]: I1122 04:25:23.582258 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:24 crc kubenswrapper[4952]: I1122 04:25:24.579197 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:24 crc kubenswrapper[4952]: I1122 04:25:24.852096 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerStarted","Data":"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4"} Nov 22 04:25:24 crc kubenswrapper[4952]: I1122 04:25:24.852271 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerStarted","Data":"217e77fd9ba7a90e0131c8e9f696a1f20fad44fff923775cc864531f79a49d7b"} Nov 22 04:25:25 crc kubenswrapper[4952]: I1122 04:25:25.873490 4952 generic.go:334] "Generic (PLEG): container finished" podID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerID="ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4" exitCode=0 Nov 22 04:25:25 crc kubenswrapper[4952]: I1122 04:25:25.873571 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerDied","Data":"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4"} Nov 22 04:25:28 crc kubenswrapper[4952]: I1122 04:25:28.907970 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerStarted","Data":"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643"} Nov 22 04:25:33 crc kubenswrapper[4952]: I1122 04:25:33.955387 4952 generic.go:334] "Generic (PLEG): container finished" podID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerID="000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643" exitCode=0 Nov 22 04:25:33 crc kubenswrapper[4952]: 
I1122 04:25:33.955692 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerDied","Data":"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643"} Nov 22 04:25:33 crc kubenswrapper[4952]: I1122 04:25:33.963898 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:25:34 crc kubenswrapper[4952]: I1122 04:25:34.968729 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerStarted","Data":"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3"} Nov 22 04:25:43 crc kubenswrapper[4952]: I1122 04:25:43.583411 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:43 crc kubenswrapper[4952]: I1122 04:25:43.583858 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:43 crc kubenswrapper[4952]: I1122 04:25:43.639321 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:43 crc kubenswrapper[4952]: I1122 04:25:43.663800 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fgp2d" podStartSLOduration=10.89229881 podStartE2EDuration="20.663774947s" podCreationTimestamp="2025-11-22 04:25:23 +0000 UTC" firstStartedPulling="2025-11-22 04:25:24.857495377 +0000 UTC m=+5489.163512650" lastFinishedPulling="2025-11-22 04:25:34.628971514 +0000 UTC m=+5498.934988787" observedRunningTime="2025-11-22 04:25:34.996067672 +0000 UTC m=+5499.302084955" watchObservedRunningTime="2025-11-22 04:25:43.663774947 +0000 UTC m=+5507.969792230" Nov 22 04:25:44 crc kubenswrapper[4952]: I1122 04:25:44.108829 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:44 crc kubenswrapper[4952]: I1122 04:25:44.164279 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:46 crc kubenswrapper[4952]: I1122 04:25:46.074934 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fgp2d" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="registry-server" containerID="cri-o://909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3" gracePeriod=2 Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.027364 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.088054 4952 generic.go:334] "Generic (PLEG): container finished" podID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerID="909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3" exitCode=0 Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.088139 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerDied","Data":"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3"} Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.088173 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fgp2d" event={"ID":"76c31cfb-9b81-404a-b7e1-abb5a30aa82f","Type":"ContainerDied","Data":"217e77fd9ba7a90e0131c8e9f696a1f20fad44fff923775cc864531f79a49d7b"} Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.088195 4952 scope.go:117] "RemoveContainer" containerID="909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.088475 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fgp2d" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.111743 4952 scope.go:117] "RemoveContainer" containerID="000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.134239 4952 scope.go:117] "RemoveContainer" containerID="ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.177131 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j96j\" (UniqueName: \"kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j\") pod \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.177392 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities\") pod \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.177506 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content\") pod \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\" (UID: \"76c31cfb-9b81-404a-b7e1-abb5a30aa82f\") " Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.178839 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities" (OuterVolumeSpecName: "utilities") pod "76c31cfb-9b81-404a-b7e1-abb5a30aa82f" (UID: "76c31cfb-9b81-404a-b7e1-abb5a30aa82f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.228346 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j" (OuterVolumeSpecName: "kube-api-access-7j96j") pod "76c31cfb-9b81-404a-b7e1-abb5a30aa82f" (UID: "76c31cfb-9b81-404a-b7e1-abb5a30aa82f"). InnerVolumeSpecName "kube-api-access-7j96j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.243128 4952 scope.go:117] "RemoveContainer" containerID="909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3" Nov 22 04:25:47 crc kubenswrapper[4952]: E1122 04:25:47.243792 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3\": container with ID starting with 909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3 not found: ID does not exist" containerID="909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.243846 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3"} err="failed to get container status \"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3\": rpc error: code = NotFound desc = could not find container \"909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3\": container with ID starting with 909cb9dcc6350406feef7b7edb29cab8f97270afc7001fe6da50aa0e6b43a5d3 not found: ID does not exist" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.243880 4952 scope.go:117] "RemoveContainer" containerID="000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643" Nov 22 04:25:47 crc kubenswrapper[4952]: E1122 04:25:47.244677 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643\": container with ID starting with 000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643 not found: ID does not exist" containerID="000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.244722 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643"} err="failed to get container status \"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643\": rpc error: code = NotFound desc = could not find container \"000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643\": container with ID starting with 000d7f7e4809195f8292ea16f4d62c23f443f6fe8222e21c7d042b5c91db0643 not found: ID does not exist" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.244753 4952 scope.go:117] "RemoveContainer" containerID="ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4" Nov 22 04:25:47 crc kubenswrapper[4952]: E1122 04:25:47.245173 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4\": container with ID starting with ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4 not found: ID does not 
exist" containerID="ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.245206 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4"} err="failed to get container status \"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4\": rpc error: code = NotFound desc = could not find container \"ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4\": container with ID starting with ab8886346d4b1ea03bb57233a7b768aa769d7dd7fb7d266261eab577709204d4 not found: ID does not exist" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.279808 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.279851 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j96j\" (UniqueName: \"kubernetes.io/projected/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-kube-api-access-7j96j\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.304194 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76c31cfb-9b81-404a-b7e1-abb5a30aa82f" (UID: "76c31cfb-9b81-404a-b7e1-abb5a30aa82f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.381632 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76c31cfb-9b81-404a-b7e1-abb5a30aa82f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.423632 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:47 crc kubenswrapper[4952]: I1122 04:25:47.431501 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fgp2d"] Nov 22 04:25:48 crc kubenswrapper[4952]: I1122 04:25:48.547913 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" path="/var/lib/kubelet/pods/76c31cfb-9b81-404a-b7e1-abb5a30aa82f/volumes" Nov 22 04:26:02 crc kubenswrapper[4952]: I1122 04:26:02.279303 4952 generic.go:334] "Generic (PLEG): container finished" podID="3ed392a8-7ded-47dc-b2f4-e21e3bed8769" containerID="430287ee93ec7f5539503a592fe0449f32e3847b09aefd7c08fdfc9e7db94266" exitCode=0 Nov 22 04:26:02 crc kubenswrapper[4952]: I1122 04:26:02.279531 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" event={"ID":"3ed392a8-7ded-47dc-b2f4-e21e3bed8769","Type":"ContainerDied","Data":"430287ee93ec7f5539503a592fe0449f32e3847b09aefd7c08fdfc9e7db94266"} Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.307222 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" event={"ID":"3ed392a8-7ded-47dc-b2f4-e21e3bed8769","Type":"ContainerDied","Data":"a09cef14f3f99ad98ac3422a98dd79be253aff9e1dbf6c2c7ca6e22f88b11de7"} Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.307926 4952 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a09cef14f3f99ad98ac3422a98dd79be253aff9e1dbf6c2c7ca6e22f88b11de7" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.336336 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.378894 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-cl79k"] Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.389212 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-cl79k"] Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.403539 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxzv\" (UniqueName: \"kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv\") pod \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.403944 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host\") pod \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\" (UID: \"3ed392a8-7ded-47dc-b2f4-e21e3bed8769\") " Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.404032 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host" (OuterVolumeSpecName: "host") pod "3ed392a8-7ded-47dc-b2f4-e21e3bed8769" (UID: "3ed392a8-7ded-47dc-b2f4-e21e3bed8769"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.404824 4952 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.409366 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv" (OuterVolumeSpecName: "kube-api-access-nnxzv") pod "3ed392a8-7ded-47dc-b2f4-e21e3bed8769" (UID: "3ed392a8-7ded-47dc-b2f4-e21e3bed8769"). InnerVolumeSpecName "kube-api-access-nnxzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.507292 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxzv\" (UniqueName: \"kubernetes.io/projected/3ed392a8-7ded-47dc-b2f4-e21e3bed8769-kube-api-access-nnxzv\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:04 crc kubenswrapper[4952]: I1122 04:26:04.543048 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed392a8-7ded-47dc-b2f4-e21e3bed8769" path="/var/lib/kubelet/pods/3ed392a8-7ded-47dc-b2f4-e21e3bed8769/volumes" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.318062 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-cl79k" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.641378 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-9clh7"] Nov 22 04:26:05 crc kubenswrapper[4952]: E1122 04:26:05.642265 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed392a8-7ded-47dc-b2f4-e21e3bed8769" containerName="container-00" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.642334 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed392a8-7ded-47dc-b2f4-e21e3bed8769" containerName="container-00" Nov 22 04:26:05 crc kubenswrapper[4952]: E1122 04:26:05.642400 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="extract-utilities" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.642453 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="extract-utilities" Nov 22 04:26:05 crc kubenswrapper[4952]: E1122 04:26:05.642513 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="registry-server" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.642588 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="registry-server" Nov 22 04:26:05 crc kubenswrapper[4952]: E1122 04:26:05.642664 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="extract-content" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.642717 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="extract-content" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.642925 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c31cfb-9b81-404a-b7e1-abb5a30aa82f" containerName="registry-server" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.643008 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed392a8-7ded-47dc-b2f4-e21e3bed8769" containerName="container-00" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.643658 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.728835 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.729024 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sfs\" (UniqueName: \"kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.831373 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.831485 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sfs\" (UniqueName: \"kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.832022 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.852617 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sfs\" (UniqueName: \"kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs\") pod \"crc-debug-9clh7\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:05 crc kubenswrapper[4952]: I1122 04:26:05.967455 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:06 crc kubenswrapper[4952]: I1122 04:26:06.328179 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" event={"ID":"f88b3de1-f046-41bd-85af-52502af4047b","Type":"ContainerStarted","Data":"66f460845b5f57c5d80cb2410530436869e6e9bd6ab4f5906729159363858484"} Nov 22 04:26:06 crc kubenswrapper[4952]: I1122 04:26:06.328570 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" event={"ID":"f88b3de1-f046-41bd-85af-52502af4047b","Type":"ContainerStarted","Data":"9fcf179c8e481f241ff6747b214ff0f90062d5d3fb953ac68a8f2d9380fb63a4"} Nov 22 04:26:06 crc kubenswrapper[4952]: I1122 04:26:06.345472 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" podStartSLOduration=1.345450031 podStartE2EDuration="1.345450031s" podCreationTimestamp="2025-11-22 04:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:06.339987116 +0000 UTC m=+5530.646004409" watchObservedRunningTime="2025-11-22 04:26:06.345450031 +0000 UTC m=+5530.651467304" Nov 22 04:26:07 crc kubenswrapper[4952]: I1122 04:26:07.337727 4952 generic.go:334] "Generic (PLEG): container finished" podID="f88b3de1-f046-41bd-85af-52502af4047b" containerID="66f460845b5f57c5d80cb2410530436869e6e9bd6ab4f5906729159363858484" exitCode=0 Nov 22 04:26:07 crc kubenswrapper[4952]: I1122 04:26:07.337796 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" event={"ID":"f88b3de1-f046-41bd-85af-52502af4047b","Type":"ContainerDied","Data":"66f460845b5f57c5d80cb2410530436869e6e9bd6ab4f5906729159363858484"} Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.445894 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.574005 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sfs\" (UniqueName: \"kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs\") pod \"f88b3de1-f046-41bd-85af-52502af4047b\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.574127 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host\") pod \"f88b3de1-f046-41bd-85af-52502af4047b\" (UID: \"f88b3de1-f046-41bd-85af-52502af4047b\") " Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.577738 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host" (OuterVolumeSpecName: "host") pod "f88b3de1-f046-41bd-85af-52502af4047b" (UID: "f88b3de1-f046-41bd-85af-52502af4047b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.598832 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs" (OuterVolumeSpecName: "kube-api-access-d4sfs") pod "f88b3de1-f046-41bd-85af-52502af4047b" (UID: "f88b3de1-f046-41bd-85af-52502af4047b"). 
InnerVolumeSpecName "kube-api-access-d4sfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.676387 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4sfs\" (UniqueName: \"kubernetes.io/projected/f88b3de1-f046-41bd-85af-52502af4047b-kube-api-access-d4sfs\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:08 crc kubenswrapper[4952]: I1122 04:26:08.676422 4952 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f88b3de1-f046-41bd-85af-52502af4047b-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:09 crc kubenswrapper[4952]: I1122 04:26:09.117131 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-9clh7"] Nov 22 04:26:09 crc kubenswrapper[4952]: I1122 04:26:09.129063 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-9clh7"] Nov 22 04:26:09 crc kubenswrapper[4952]: I1122 04:26:09.355813 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fcf179c8e481f241ff6747b214ff0f90062d5d3fb953ac68a8f2d9380fb63a4" Nov 22 04:26:09 crc kubenswrapper[4952]: I1122 04:26:09.355891 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-9clh7" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.290373 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-fr5kz"] Nov 22 04:26:10 crc kubenswrapper[4952]: E1122 04:26:10.291029 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88b3de1-f046-41bd-85af-52502af4047b" containerName="container-00" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.291051 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88b3de1-f046-41bd-85af-52502af4047b" containerName="container-00" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.291363 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="f88b3de1-f046-41bd-85af-52502af4047b" containerName="container-00" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.292320 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.408825 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.409207 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4hr\" (UniqueName: \"kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.511595 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.511779 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4hr\" (UniqueName: \"kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.512041 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.532649 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4hr\" (UniqueName: \"kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr\") pod \"crc-debug-fr5kz\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.542368 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88b3de1-f046-41bd-85af-52502af4047b" path="/var/lib/kubelet/pods/f88b3de1-f046-41bd-85af-52502af4047b/volumes" Nov 22 04:26:10 crc kubenswrapper[4952]: I1122 04:26:10.613387 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:10 crc kubenswrapper[4952]: W1122 04:26:10.654741 4952 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef20724_3eba_4fc6_bca8_1095aa64d546.slice/crio-346e46a9db713e700c7b3146a1a6526d9f7cfa8a29754cd81380ca08d2ef6c23 WatchSource:0}: Error finding container 346e46a9db713e700c7b3146a1a6526d9f7cfa8a29754cd81380ca08d2ef6c23: Status 404 returned error can't find the container with id 346e46a9db713e700c7b3146a1a6526d9f7cfa8a29754cd81380ca08d2ef6c23 Nov 22 04:26:11 crc kubenswrapper[4952]: I1122 04:26:11.375189 4952 generic.go:334] "Generic (PLEG): container finished" podID="bef20724-3eba-4fc6-bca8-1095aa64d546" containerID="7e3f7bdde5841074500be5eda6151fafc249f37f4fdfac08afe3647e40bcd5a6" exitCode=0 Nov 22 04:26:11 crc kubenswrapper[4952]: I1122 04:26:11.375333 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" event={"ID":"bef20724-3eba-4fc6-bca8-1095aa64d546","Type":"ContainerDied","Data":"7e3f7bdde5841074500be5eda6151fafc249f37f4fdfac08afe3647e40bcd5a6"} Nov 22 04:26:11 crc kubenswrapper[4952]: I1122 04:26:11.376945 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" event={"ID":"bef20724-3eba-4fc6-bca8-1095aa64d546","Type":"ContainerStarted","Data":"346e46a9db713e700c7b3146a1a6526d9f7cfa8a29754cd81380ca08d2ef6c23"} Nov 22 04:26:11 crc kubenswrapper[4952]: I1122 04:26:11.426194 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-fr5kz"] Nov 22 04:26:11 crc kubenswrapper[4952]: I1122 04:26:11.436117 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjcqx/crc-debug-fr5kz"] Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.524115 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.659832 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4hr\" (UniqueName: \"kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr\") pod \"bef20724-3eba-4fc6-bca8-1095aa64d546\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.660276 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host\") pod \"bef20724-3eba-4fc6-bca8-1095aa64d546\" (UID: \"bef20724-3eba-4fc6-bca8-1095aa64d546\") " Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.660321 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host" (OuterVolumeSpecName: "host") pod "bef20724-3eba-4fc6-bca8-1095aa64d546" (UID: "bef20724-3eba-4fc6-bca8-1095aa64d546"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.661329 4952 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bef20724-3eba-4fc6-bca8-1095aa64d546-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.666131 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr" (OuterVolumeSpecName: "kube-api-access-8k4hr") pod "bef20724-3eba-4fc6-bca8-1095aa64d546" (UID: "bef20724-3eba-4fc6-bca8-1095aa64d546"). InnerVolumeSpecName "kube-api-access-8k4hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:12 crc kubenswrapper[4952]: I1122 04:26:12.763788 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4hr\" (UniqueName: \"kubernetes.io/projected/bef20724-3eba-4fc6-bca8-1095aa64d546-kube-api-access-8k4hr\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:13 crc kubenswrapper[4952]: I1122 04:26:13.402604 4952 scope.go:117] "RemoveContainer" containerID="7e3f7bdde5841074500be5eda6151fafc249f37f4fdfac08afe3647e40bcd5a6" Nov 22 04:26:13 crc kubenswrapper[4952]: I1122 04:26:13.402637 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/crc-debug-fr5kz" Nov 22 04:26:14 crc kubenswrapper[4952]: I1122 04:26:14.545169 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef20724-3eba-4fc6-bca8-1095aa64d546" path="/var/lib/kubelet/pods/bef20724-3eba-4fc6-bca8-1095aa64d546/volumes" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.657620 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:31 crc kubenswrapper[4952]: E1122 04:26:31.658659 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef20724-3eba-4fc6-bca8-1095aa64d546" containerName="container-00" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.658678 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef20724-3eba-4fc6-bca8-1095aa64d546" containerName="container-00" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.658906 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef20724-3eba-4fc6-bca8-1095aa64d546" containerName="container-00" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.660534 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.671162 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.755415 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lzbm\" (UniqueName: \"kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.755474 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.755898 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.858305 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lzbm\" (UniqueName: \"kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.858360 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.858486 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.859209 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.859422 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.877423 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5lzbm\" (UniqueName: \"kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm\") pod \"certified-operators-h2r86\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:31 crc kubenswrapper[4952]: I1122 04:26:31.985210 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:32 crc kubenswrapper[4952]: I1122 04:26:32.553743 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:32 crc kubenswrapper[4952]: I1122 04:26:32.661143 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerStarted","Data":"1b9c7c798324cd58c474210e8053324130ec7547a3441681f506576731771f29"} Nov 22 04:26:33 crc kubenswrapper[4952]: I1122 04:26:33.679217 4952 generic.go:334] "Generic (PLEG): container finished" podID="2171b86d-e053-4b6d-b456-95f3518b014a" containerID="e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062" exitCode=0 Nov 22 04:26:33 crc kubenswrapper[4952]: I1122 04:26:33.679299 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerDied","Data":"e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062"} Nov 22 04:26:34 crc kubenswrapper[4952]: I1122 04:26:34.697069 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerStarted","Data":"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28"} Nov 22 04:26:35 crc kubenswrapper[4952]: I1122 04:26:35.706941 4952 generic.go:334] "Generic (PLEG): container finished" podID="2171b86d-e053-4b6d-b456-95f3518b014a" containerID="018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28" exitCode=0 Nov 22 04:26:35 crc kubenswrapper[4952]: I1122 04:26:35.706979 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerDied","Data":"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28"} Nov 22 04:26:36 crc kubenswrapper[4952]: I1122 04:26:36.732257 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerStarted","Data":"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6"} Nov 22 04:26:36 crc kubenswrapper[4952]: I1122 04:26:36.752376 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h2r86" podStartSLOduration=3.273130519 podStartE2EDuration="5.752361131s" podCreationTimestamp="2025-11-22 04:26:31 +0000 UTC" firstStartedPulling="2025-11-22 04:26:33.682629206 +0000 UTC m=+5557.988646519" lastFinishedPulling="2025-11-22 04:26:36.161859818 +0000 UTC m=+5560.467877131" observedRunningTime="2025-11-22 04:26:36.751610231 +0000 UTC m=+5561.057627514" watchObservedRunningTime="2025-11-22 04:26:36.752361131 +0000 UTC m=+5561.058378404" Nov 22 04:26:38 crc kubenswrapper[4952]: I1122 04:26:38.858903 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-746b4d95d8-xhmsq_0b0307be-c90c-45a1-bef7-32bf07c7e35e/barbican-api/0.log" Nov 22 04:26:38 crc kubenswrapper[4952]: I1122 04:26:38.913744 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-746b4d95d8-xhmsq_0b0307be-c90c-45a1-bef7-32bf07c7e35e/barbican-api-log/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.051889 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9d695d86-pd2dh_19657c5b-43d6-45c6-802f-7fe7e6665f11/barbican-keystone-listener/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.211261 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c47d667f9-d4njv_cf053aa9-0d2f-486e-b410-89fa0afebaad/barbican-worker/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.305857 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9d695d86-pd2dh_19657c5b-43d6-45c6-802f-7fe7e6665f11/barbican-keystone-listener-log/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.314932 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c47d667f9-d4njv_cf053aa9-0d2f-486e-b410-89fa0afebaad/barbican-worker-log/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.483473 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hh6mq_10f9b191-e7da-494f-b29d-b0594d9044c2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.538616 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ba66b462-c52b-4474-80c9-670bf6be8870/ceilometer-central-agent/1.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.617865 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ba66b462-c52b-4474-80c9-670bf6be8870/ceilometer-central-agent/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.664329 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ba66b462-c52b-4474-80c9-670bf6be8870/ceilometer-notification-agent/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.723379 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ba66b462-c52b-4474-80c9-670bf6be8870/proxy-httpd/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.765059 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ba66b462-c52b-4474-80c9-670bf6be8870/sg-core/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.890650 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-w8b52_11e2e39c-3f90-448f-8438-fb38763a3c03/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:39 crc kubenswrapper[4952]: I1122 04:26:39.958179 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-x2qhl_c20a61fd-51aa-46cb-9a9d-ffb8908dd2c4/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:40 crc kubenswrapper[4952]: I1122 04:26:40.500561 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e99cda79-b32c-4e09-8c24-9a4eb0c934ef/probe/0.log" Nov 22 04:26:40 crc kubenswrapper[4952]: I1122 04:26:40.707640 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_19fdfd30-4e69-4d96-951b-4daa193985e9/cinder-api/0.log" Nov 22 04:26:40 crc kubenswrapper[4952]: I1122 04:26:40.924373 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_19fdfd30-4e69-4d96-951b-4daa193985e9/cinder-api-log/0.log" Nov 22 04:26:40 crc kubenswrapper[4952]: I1122 04:26:40.959417 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e99cda79-b32c-4e09-8c24-9a4eb0c934ef/cinder-backup/0.log" Nov 22 04:26:40 crc kubenswrapper[4952]: I1122 04:26:40.977704 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ff51385-a462-478d-bd61-62d15d7c5c41/cinder-scheduler/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.038655 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ff51385-a462-478d-bd61-62d15d7c5c41/probe/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.191663 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6dd872c8-ca07-4e06-9666-22d89916ead1/probe/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.427464 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-g64ph_4a56c774-0c51-4378-bfce-81bb5481f736/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.597868 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gz55k_145d85a9-5de9-42d1-b463-d195f016e395/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.754579 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wzpbj_91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7/init/0.log" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.986394 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:41 crc kubenswrapper[4952]: I1122 04:26:41.986423 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.026163 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wzpbj_91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7/init/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.040641 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.096926 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-wzpbj_91b94dc0-c7d3-4bc0-8e87-fb387ed6d9a7/dnsmasq-dns/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.302022 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72dcd5f8-1d52-42ab-a481-313bf4f5148d/glance-log/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.533628 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72dcd5f8-1d52-42ab-a481-313bf4f5148d/glance-httpd/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.550375 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_39989530-1e90-4cb1-b7c5-681a9bdb322b/glance-httpd/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.558086 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_39989530-1e90-4cb1-b7c5-681a9bdb322b/glance-log/0.log" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.830919 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.877356 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:42 crc kubenswrapper[4952]: I1122 04:26:42.890657 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-krzsg_70025cca-7c2a-4798-a3ab-5f58dd05033c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.000504 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56848f9f44-m7z42_ea3db97f-72b6-4eaa-b8ea-256a5691008f/horizon/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.087220 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9x9cm_558d097a-8399-4b38-b883-f28c31b108a3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.321075 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56848f9f44-m7z42_ea3db97f-72b6-4eaa-b8ea-256a5691008f/horizon-log/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.441742 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396401-wntch_861a0695-d514-4617-9720-062db08dbae7/keystone-cron/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.569377 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5c8e5784-03cd-4a8a-9859-afe728764282/kube-state-metrics/0.log" Nov 22 04:26:43 crc kubenswrapper[4952]: I1122 04:26:43.811575 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wnl5c_4f2a731f-8431-4769-a78e-6522954dd7b5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.002178 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7aa384b1-1586-4709-b258-def203cac8f5/manila-api-log/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.098250 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7aa384b1-1586-4709-b258-def203cac8f5/manila-api/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.262004 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-65f67fb964-vlq9s_28d23392-8bab-45b0-a64b-a440b2850703/keystone-api/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.312457 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_21fe7ebd-1970-410b-948e-1837b5d6295b/probe/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.317293 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_21fe7ebd-1970-410b-948e-1837b5d6295b/manila-scheduler/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.480181 4952 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_02064695-893b-4036-83b5-17863ffb7028/manila-share/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.575386 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_02064695-893b-4036-83b5-17863ffb7028/probe/0.log" Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.795298 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h2r86" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="registry-server" containerID="cri-o://0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6" gracePeriod=2 Nov 22 04:26:44 crc kubenswrapper[4952]: I1122 04:26:44.982725 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85576cd755-lg8j8_b5998a45-a2cf-4155-b31c-7c39c5768ec1/neutron-httpd/0.log" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.156293 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-25rv5_fed80aaa-96fc-489e-b8fa-b25d9dff2f51/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.273607 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.282063 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85576cd755-lg8j8_b5998a45-a2cf-4155-b31c-7c39c5768ec1/neutron-api/0.log" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.421334 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lzbm\" (UniqueName: \"kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm\") pod \"2171b86d-e053-4b6d-b456-95f3518b014a\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.421523 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities\") pod \"2171b86d-e053-4b6d-b456-95f3518b014a\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.421692 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content\") pod \"2171b86d-e053-4b6d-b456-95f3518b014a\" (UID: \"2171b86d-e053-4b6d-b456-95f3518b014a\") " Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.422298 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities" (OuterVolumeSpecName: "utilities") pod "2171b86d-e053-4b6d-b456-95f3518b014a" (UID: "2171b86d-e053-4b6d-b456-95f3518b014a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.436760 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm" (OuterVolumeSpecName: "kube-api-access-5lzbm") pod "2171b86d-e053-4b6d-b456-95f3518b014a" (UID: "2171b86d-e053-4b6d-b456-95f3518b014a"). 
InnerVolumeSpecName "kube-api-access-5lzbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.461905 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2171b86d-e053-4b6d-b456-95f3518b014a" (UID: "2171b86d-e053-4b6d-b456-95f3518b014a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.523515 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.523558 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2171b86d-e053-4b6d-b456-95f3518b014a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.523569 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lzbm\" (UniqueName: \"kubernetes.io/projected/2171b86d-e053-4b6d-b456-95f3518b014a-kube-api-access-5lzbm\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.806804 4952 generic.go:334] "Generic (PLEG): container finished" podID="2171b86d-e053-4b6d-b456-95f3518b014a" containerID="0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6" exitCode=0 Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.806850 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerDied","Data":"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6"} Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.806875 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2r86" event={"ID":"2171b86d-e053-4b6d-b456-95f3518b014a","Type":"ContainerDied","Data":"1b9c7c798324cd58c474210e8053324130ec7547a3441681f506576731771f29"} Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.806890 4952 scope.go:117] "RemoveContainer" containerID="0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.807029 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2r86" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.833913 4952 scope.go:117] "RemoveContainer" containerID="018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.862311 4952 scope.go:117] "RemoveContainer" containerID="e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.883525 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.899221 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h2r86"] Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.915068 4952 scope.go:117] "RemoveContainer" containerID="0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6" Nov 22 04:26:45 crc kubenswrapper[4952]: E1122 04:26:45.915596 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6\": container with ID starting with 0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6 not found: ID does not exist" containerID="0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.915628 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6"} err="failed to get container status \"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6\": rpc error: code = NotFound desc = could not find container \"0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6\": container with ID starting with 0d1ef5944c12ab95adba97cfff3e65eeee3df336b82abdfcae7b4ead222590d6 not found: ID does not exist" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.915653 4952 scope.go:117] "RemoveContainer" containerID="018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28" Nov 22 04:26:45 crc kubenswrapper[4952]: E1122 04:26:45.916007 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28\": container with ID starting with 018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28 not found: ID does not exist" containerID="018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.916167 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28"} err="failed to get container status \"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28\": rpc error: code = NotFound desc = could not find container \"018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28\": container with ID starting with 018dab2dcdc69649e1b5f224089666a91ff8f6ff96b0a1b262bb41e1b0a79a28 not found: ID does not exist" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.916192 4952 scope.go:117] "RemoveContainer" containerID="e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062" Nov 22 04:26:45 crc kubenswrapper[4952]: E1122 04:26:45.916439 4952 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062\": container with ID starting with e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062 not found: ID does not exist" containerID="e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062" Nov 22 04:26:45 crc kubenswrapper[4952]: I1122 04:26:45.916491 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062"} err="failed to get container status \"e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062\": rpc error: code = NotFound desc = could not find container \"e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062\": container with ID starting with e20f95171742d50ddc33849a0f61ad01e7e65a1038d41ca7df7e7e1efc61c062 not found: ID does not exist" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.034015 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1ed8df98-1a16-4c89-b60b-c3589ec701be/nova-cell0-conductor-conductor/0.log" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.321432 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_66852029-394b-4d8f-9104-709ac254def2/nova-api-log/0.log" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.548738 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" path="/var/lib/kubelet/pods/2171b86d-e053-4b6d-b456-95f3518b014a/volumes" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.685443 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_cdf38f2c-d0ed-4724-b542-fe296d6d6466/nova-cell1-conductor-conductor/0.log" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.843508 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_66852029-394b-4d8f-9104-709ac254def2/nova-api-api/0.log" Nov 22 04:26:46 crc kubenswrapper[4952]: I1122 04:26:46.922290 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b5a83aed-143b-40e4-a06a-7452102935c2/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.043455 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_169f3305-945d-47b9-8764-6e37ee8863e0/memcached/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.093133 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-pjjmz_80e23ef1-5ca7-4b59-a4a7-17586e0a1989/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.233694 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86/nova-metadata-log/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.553459 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0f098b30-65ee-4b19-9674-c384bddf0832/mysql-bootstrap/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.691995 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f34eb03c-9da3-48a6-8b9d-1c507aa538d2/nova-scheduler-scheduler/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.707601 4952 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0f098b30-65ee-4b19-9674-c384bddf0832/mysql-bootstrap/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.766714 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0f098b30-65ee-4b19-9674-c384bddf0832/galera/0.log" Nov 22 04:26:47 crc kubenswrapper[4952]: I1122 04:26:47.990736 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9a3eb772-8262-4b28-873f-63f00885054d/mysql-bootstrap/0.log" Nov 22 04:26:48 crc kubenswrapper[4952]: I1122 04:26:48.164168 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9a3eb772-8262-4b28-873f-63f00885054d/mysql-bootstrap/0.log" Nov 22 04:26:48 crc kubenswrapper[4952]: I1122 04:26:48.254280 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9a3eb772-8262-4b28-873f-63f00885054d/galera/0.log" Nov 22 04:26:48 crc kubenswrapper[4952]: I1122 04:26:48.563517 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4768e499-4f1e-4a22-9e20-b05cf83f1c89/openstackclient/0.log" Nov 22 04:26:48 crc kubenswrapper[4952]: I1122 04:26:48.658201 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9kcvz_82bf89be-221b-4963-bfa4-794e0eb978c6/ovn-controller/0.log" Nov 22 04:26:48 crc kubenswrapper[4952]: I1122 04:26:48.851570 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-74dm2_2610ac31-191b-4b34-8b6c-2362a88d2e40/openstack-network-exporter/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.063089 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-blvps_36a574de-e2a6-4711-82ae-a7ffc34ef5fd/ovsdb-server-init/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.197007 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-blvps_36a574de-e2a6-4711-82ae-a7ffc34ef5fd/ovsdb-server-init/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.219427 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-blvps_36a574de-e2a6-4711-82ae-a7ffc34ef5fd/ovs-vswitchd/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.255053 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-blvps_36a574de-e2a6-4711-82ae-a7ffc34ef5fd/ovsdb-server/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.494655 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fvbt6_99bc1860-8d74-4e04-ba87-35b5254e7a57/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.634639 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc722b47-abbf-470b-8e75-be1c5208c604/openstack-network-exporter/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.683092 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7583e9ac-c3b8-4d9d-89f4-0732dd0e6e86/nova-metadata-metadata/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.687755 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc722b47-abbf-470b-8e75-be1c5208c604/ovn-northd/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.811577 4952 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dbd8adf1-9949-4500-83d9-dbcb3d42037f/openstack-network-exporter/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.848805 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dbd8adf1-9949-4500-83d9-dbcb3d42037f/ovsdbserver-nb/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.954484 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da62937c-7359-4ba3-bf95-ddc5545df677/openstack-network-exporter/0.log" Nov 22 04:26:49 crc kubenswrapper[4952]: I1122 04:26:49.986012 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_da62937c-7359-4ba3-bf95-ddc5545df677/ovsdbserver-sb/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.098189 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6dd872c8-ca07-4e06-9666-22d89916ead1/cinder-volume/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.265237 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64f5d57fc8-zqkpt_d787583c-f83b-4ced-aee7-b3c9c22217a7/placement-log/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.279482 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8f72c2a8-8441-4469-a7aa-d87b27a7dd6a/setup-container/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.284685 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64f5d57fc8-zqkpt_d787583c-f83b-4ced-aee7-b3c9c22217a7/placement-api/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.456816 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8f72c2a8-8441-4469-a7aa-d87b27a7dd6a/rabbitmq/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.458637 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f71324dd-d6ac-457c-83be-541d1afa5ec4/setup-container/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.460891 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8f72c2a8-8441-4469-a7aa-d87b27a7dd6a/setup-container/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.695820 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f71324dd-d6ac-457c-83be-541d1afa5ec4/setup-container/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.700881 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f71324dd-d6ac-457c-83be-541d1afa5ec4/rabbitmq/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.716396 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kl7nz_6908216d-a851-4626-a77f-c24f71c10f97/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.893991 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rrshs_b951af39-2eb8-430d-933b-121f858c322c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.915108 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s79b7_36289a09-ac23-452c-b88f-9eba30618fe3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:26:50 crc kubenswrapper[4952]: I1122 04:26:50.926099 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p9xll_87882231-d670-44b0-bd0f-9d637b9ddc98/ssh-known-hosts-edpm-deployment/0.log" Nov 22 04:26:51 crc kubenswrapper[4952]: I1122 04:26:51.123399 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e4eb377f-8d5f-44e6-b719-2374703359e3/test-operator-logs-container/0.log" Nov 22 04:26:51 crc kubenswrapper[4952]: I1122 04:26:51.151585 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b32e0459-7aee-4841-8281-da334fe3e8d8/tempest-tests-tempest-tests-runner/0.log" Nov 22 04:26:51 crc kubenswrapper[4952]: I1122 04:26:51.293282 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-w7f7v_89b44b40-f505-48de-ada1-e476204fd059/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:27:04 crc kubenswrapper[4952]: I1122 04:27:04.547586 4952 scope.go:117] "RemoveContainer" containerID="822dff6070c047ad0ec3230d88ed17b1d155ee3de1deb39ca837ef381a7c85c6" Nov 22 04:27:04 crc kubenswrapper[4952]: I1122 04:27:04.582843 4952 scope.go:117] "RemoveContainer" containerID="77573c644c1c8b807123120bfadc4036b5ab6ada66c8ae7b2f8b9249a04818d6" Nov 22 04:27:04 crc kubenswrapper[4952]: I1122 04:27:04.648123 4952 scope.go:117] "RemoveContainer" containerID="a8f73844df760520ad1fca25334a760c60380d69cb1d4dea880109601a91139d" Nov 22 04:27:12 crc kubenswrapper[4952]: I1122 04:27:12.713011 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-j9bj9_3166414a-5d0f-460b-81cd-a8cfab489ff5/kube-rbac-proxy/0.log" Nov 22 04:27:12 crc kubenswrapper[4952]: I1122 04:27:12.742236 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-j9bj9_3166414a-5d0f-460b-81cd-a8cfab489ff5/manager/0.log" Nov 22 04:27:12 crc kubenswrapper[4952]: I1122 04:27:12.915841 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-fdcbbd9b5-w9lwq_2704d952-0ecf-42b9-9185-625d8c662a00/kube-rbac-proxy/0.log" Nov 22 04:27:12 crc kubenswrapper[4952]: I1122 04:27:12.982822 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-fdcbbd9b5-w9lwq_2704d952-0ecf-42b9-9185-625d8c662a00/manager/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.036959 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/util/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.255177 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/util/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.291769 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/pull/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.302343 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/pull/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.468975 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/extract/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.496843 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/pull/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.509262 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d00cd71f6aed78abab6fc06dcd9036e7eaa639e20a6aef60bb6a1b1a82ppkrl_87e6d798-15ee-44f5-9e53-6bd17839c89d/util/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.678065 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-t2mlk_dbb212fa-2d69-47c9-8ddc-3f1d78ce745a/kube-rbac-proxy/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.720127 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-t2mlk_dbb212fa-2d69-47c9-8ddc-3f1d78ce745a/manager/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.744677 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-lhbkq_996261d7-26a3-41f8-9531-73c3ec296c1d/kube-rbac-proxy/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.918792 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-lhbkq_996261d7-26a3-41f8-9531-73c3ec296c1d/manager/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.926345 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-29fxl_0b8276cf-2a3d-40ad-83c6-fa522270b8a7/kube-rbac-proxy/0.log" Nov 22 04:27:13 crc kubenswrapper[4952]: I1122 04:27:13.929618 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-29fxl_0b8276cf-2a3d-40ad-83c6-fa522270b8a7/manager/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.085498 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-str4f_36ba8aa7-85ec-461d-a0d7-39f09c60289f/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.117705 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-str4f_36ba8aa7-85ec-461d-a0d7-39f09c60289f/manager/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.260651 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-2vq4f_e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 
04:27:14.401784 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-2vq4f_e3c0cb40-214c-4e4c-a7d8-3f6b47b9715b/manager/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.406280 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-g7gzn_3de98aa0-3a15-4c46-adf3-d1715ccf5274/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.453036 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-g7gzn_3de98aa0-3a15-4c46-adf3-d1715ccf5274/manager/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.574575 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-8d2db_e8259b74-ce3d-4875-ac32-71e4397b4c01/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.717883 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-8d2db_e8259b74-ce3d-4875-ac32-71e4397b4c01/manager/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.801622 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-dkhsn_6523048e-98da-4e64-9c79-7bbeda6ea361/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.885286 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-gr8vb_032f6fc1-7bde-422b-bbe0-d83027f069d0/kube-rbac-proxy/0.log" Nov 22 04:27:14 crc kubenswrapper[4952]: I1122 04:27:14.946304 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-dkhsn_6523048e-98da-4e64-9c79-7bbeda6ea361/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.076382 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-gr8vb_032f6fc1-7bde-422b-bbe0-d83027f069d0/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.090580 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-qw49c_887a5b13-80a9-4d9c-8b14-6768805ca936/kube-rbac-proxy/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.200351 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-qw49c_887a5b13-80a9-4d9c-8b14-6768805ca936/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.307973 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-ztqdk_c866a446-9629-4d59-8953-0599bda45549/kube-rbac-proxy/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.363964 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-ztqdk_c866a446-9629-4d59-8953-0599bda45549/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.480163 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-68bjm_d44a318e-4d58-4719-a567-6d849321b946/kube-rbac-proxy/0.log" Nov 22 04:27:15 crc 
kubenswrapper[4952]: I1122 04:27:15.575369 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-68bjm_d44a318e-4d58-4719-a567-6d849321b946/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.620759 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9_d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5/kube-rbac-proxy/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.693234 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.693453 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-4mcn9_d324f1a8-46a5-4bb7-b84b-5f42a6a2c5d5/manager/0.log" Nov 22 04:27:15 crc kubenswrapper[4952]: E1122 04:27:15.698982 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="extract-utilities" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.699017 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="extract-utilities" Nov 22 04:27:15 crc kubenswrapper[4952]: E1122 04:27:15.699037 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="extract-content" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.699047 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="extract-content" Nov 22 04:27:15 crc kubenswrapper[4952]: E1122 04:27:15.699064 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="registry-server" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.699073 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="registry-server" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.699435 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="2171b86d-e053-4b6d-b456-95f3518b014a" containerName="registry-server" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.701243 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.710908 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.754568 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxc8h\" (UniqueName: \"kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.754617 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.754647 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.856653 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.856707 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.856858 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxc8h\" (UniqueName: \"kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.857576 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.857787 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.881794 4952 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mxc8h\" (UniqueName: \"kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h\") pod \"community-operators-rktgw\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:15 crc kubenswrapper[4952]: I1122 04:27:15.943289 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b68cc995-2h722_ff89dfa5-e056-4753-99a0-4aad073a1734/kube-rbac-proxy/0.log" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.077886 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.530144 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66c8cfd656-bvmf8_79375e72-ee47-4b49-95aa-bcf8e211145f/kube-rbac-proxy/0.log" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.668059 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.701064 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xbtqc_37332314-a4d3-4c04-a480-561c80a2fa8a/registry-server/0.log" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.810169 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66c8cfd656-bvmf8_79375e72-ee47-4b49-95aa-bcf8e211145f/operator/0.log" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.938318 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-hfx5m_e71d0c55-c142-4a8d-8677-accf9858de48/kube-rbac-proxy/0.log" Nov 22 04:27:16 crc kubenswrapper[4952]: I1122 04:27:16.951941 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-hfx5m_e71d0c55-c142-4a8d-8677-accf9858de48/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.064043 4952 generic.go:334] "Generic (PLEG): container finished" podID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerID="0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0" exitCode=0 Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.064090 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerDied","Data":"0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0"} Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.064120 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerStarted","Data":"b9e732241156b3e9afd869ab8a93a3747bbd201d1b163c3aaa61a7b1ae973248"} Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.090741 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-wnb2k_078fa30e-9d17-4ca9-911a-43a0376ffe8f/kube-rbac-proxy/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.228290 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b68cc995-2h722_ff89dfa5-e056-4753-99a0-4aad073a1734/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.230344 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-wnb2k_078fa30e-9d17-4ca9-911a-43a0376ffe8f/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.242451 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-r47mr_343962a9-648e-4dd5-a813-730ef99a136e/operator/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.409456 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-d9k2w_5cbee4eb-c188-4a47-9d37-4de16bd79f07/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.411539 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-d9k2w_5cbee4eb-c188-4a47-9d37-4de16bd79f07/kube-rbac-proxy/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.440786 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-vlsq4_87f47e69-902e-4a6a-a6d6-ce72f960a9e4/kube-rbac-proxy/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.565338 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-vlsq4_87f47e69-902e-4a6a-a6d6-ce72f960a9e4/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.614041 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-hb768_6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2/kube-rbac-proxy/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.690825 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-hb768_6d83cc9e-ff79-489f-ab6f-f8072ddcdbf2/manager/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.805788 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-ftxgw_6791492d-51b4-4cd0-bbac-bd690baec76e/kube-rbac-proxy/0.log" Nov 22 04:27:17 crc kubenswrapper[4952]: I1122 04:27:17.843131 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-ftxgw_6791492d-51b4-4cd0-bbac-bd690baec76e/manager/0.log" Nov 22 04:27:18 crc kubenswrapper[4952]: I1122 04:27:18.073315 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerStarted","Data":"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56"} Nov 22 04:27:21 crc kubenswrapper[4952]: I1122 04:27:21.100827 4952 generic.go:334] "Generic (PLEG): container finished" podID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerID="ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56" exitCode=0 Nov 22 04:27:21 crc kubenswrapper[4952]: I1122 04:27:21.100905 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" 
event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerDied","Data":"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56"} Nov 22 04:27:22 crc kubenswrapper[4952]: I1122 04:27:22.113182 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerStarted","Data":"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4"} Nov 22 04:27:22 crc kubenswrapper[4952]: I1122 04:27:22.133479 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rktgw" podStartSLOduration=2.684398551 podStartE2EDuration="7.133458719s" podCreationTimestamp="2025-11-22 04:27:15 +0000 UTC" firstStartedPulling="2025-11-22 04:27:17.067646219 +0000 UTC m=+5601.373663492" lastFinishedPulling="2025-11-22 04:27:21.516706387 +0000 UTC m=+5605.822723660" observedRunningTime="2025-11-22 04:27:22.130025858 +0000 UTC m=+5606.436043141" watchObservedRunningTime="2025-11-22 04:27:22.133458719 +0000 UTC m=+5606.439475992" Nov 22 04:27:26 crc kubenswrapper[4952]: I1122 04:27:26.078830 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:26 crc kubenswrapper[4952]: I1122 04:27:26.079361 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:26 crc kubenswrapper[4952]: I1122 04:27:26.127383 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:26 crc kubenswrapper[4952]: I1122 04:27:26.192976 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:26 crc kubenswrapper[4952]: I1122 04:27:26.365189 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:28 crc kubenswrapper[4952]: I1122 04:27:28.167321 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rktgw" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="registry-server" containerID="cri-o://4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4" gracePeriod=2 Nov 22 04:27:28 crc kubenswrapper[4952]: I1122 04:27:28.342079 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:27:28 crc kubenswrapper[4952]: I1122 04:27:28.342139 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.044220 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.140737 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities\") pod \"4bfe9d56-246b-409d-9395-20d88c8274d8\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.140884 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxc8h\" (UniqueName: \"kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h\") pod \"4bfe9d56-246b-409d-9395-20d88c8274d8\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.141030 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content\") pod \"4bfe9d56-246b-409d-9395-20d88c8274d8\" (UID: \"4bfe9d56-246b-409d-9395-20d88c8274d8\") " Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.141733 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities" (OuterVolumeSpecName: "utilities") pod "4bfe9d56-246b-409d-9395-20d88c8274d8" (UID: "4bfe9d56-246b-409d-9395-20d88c8274d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.141940 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.154045 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h" (OuterVolumeSpecName: "kube-api-access-mxc8h") pod "4bfe9d56-246b-409d-9395-20d88c8274d8" (UID: "4bfe9d56-246b-409d-9395-20d88c8274d8"). InnerVolumeSpecName "kube-api-access-mxc8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.188897 4952 generic.go:334] "Generic (PLEG): container finished" podID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerID="4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4" exitCode=0 Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.188984 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerDied","Data":"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4"} Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.189024 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rktgw" event={"ID":"4bfe9d56-246b-409d-9395-20d88c8274d8","Type":"ContainerDied","Data":"b9e732241156b3e9afd869ab8a93a3747bbd201d1b163c3aaa61a7b1ae973248"} Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.189042 4952 scope.go:117] "RemoveContainer" containerID="4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.189306 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rktgw" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.192282 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bfe9d56-246b-409d-9395-20d88c8274d8" (UID: "4bfe9d56-246b-409d-9395-20d88c8274d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.231132 4952 scope.go:117] "RemoveContainer" containerID="ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.244220 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxc8h\" (UniqueName: \"kubernetes.io/projected/4bfe9d56-246b-409d-9395-20d88c8274d8-kube-api-access-mxc8h\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.244263 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe9d56-246b-409d-9395-20d88c8274d8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.253746 4952 scope.go:117] "RemoveContainer" containerID="0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.317168 4952 scope.go:117] "RemoveContainer" containerID="4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4" Nov 22 04:27:30 crc kubenswrapper[4952]: E1122 04:27:30.317985 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4\": container with ID starting with 4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4 not found: ID does not exist" containerID="4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.318040 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4"} err="failed to get container status \"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4\": rpc error: code = NotFound desc = could not find container \"4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4\": container with ID starting with 4a0e40c0cd4423a28f671e7c758542c22c41e45f72b25c7ba96c8b4c203d98f4 not found: ID does not exist" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.318072 4952 scope.go:117] "RemoveContainer" containerID="ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56" Nov 22 04:27:30 crc kubenswrapper[4952]: E1122 04:27:30.319392 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56\": container with ID starting with ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56 not found: ID does not exist" containerID="ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.319503 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56"} 
err="failed to get container status \"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56\": rpc error: code = NotFound desc = could not find container \"ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56\": container with ID starting with ee06a5387cdc7643c1c6dccdcdc3abfa75a0c835ee657ef9bb4f10aad7244e56 not found: ID does not exist" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.319620 4952 scope.go:117] "RemoveContainer" containerID="0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0" Nov 22 04:27:30 crc kubenswrapper[4952]: E1122 04:27:30.320118 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0\": container with ID starting with 0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0 not found: ID does not exist" containerID="0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.320218 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0"} err="failed to get container status \"0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0\": rpc error: code = NotFound desc = could not find container \"0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0\": container with ID starting with 0fe34b68c464cc3bc57a75c9f1e3c64ec82f341b1501a498288ba830f57359a0 not found: ID does not exist" Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.549049 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:30 crc kubenswrapper[4952]: I1122 04:27:30.549121 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rktgw"] Nov 22 04:27:32 crc kubenswrapper[4952]: I1122 04:27:32.544349 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" path="/var/lib/kubelet/pods/4bfe9d56-246b-409d-9395-20d88c8274d8/volumes" Nov 22 04:27:34 crc kubenswrapper[4952]: I1122 04:27:34.695157 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k75vw_c8b36a8f-760f-47c0-a090-c1f8c8ac44c5/control-plane-machine-set-operator/0.log" Nov 22 04:27:34 crc kubenswrapper[4952]: I1122 04:27:34.838473 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mz7ld_e043e178-e7e5-4ddf-b561-7253433d6e81/machine-api-operator/0.log" Nov 22 04:27:34 crc kubenswrapper[4952]: I1122 04:27:34.866561 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mz7ld_e043e178-e7e5-4ddf-b561-7253433d6e81/kube-rbac-proxy/0.log" Nov 22 04:27:49 crc kubenswrapper[4952]: I1122 04:27:49.158283 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2zjtj_a6cc04eb-89a2-4ef1-a147-0be36ae00d02/cert-manager-controller/0.log" Nov 22 04:27:49 crc kubenswrapper[4952]: I1122 04:27:49.380239 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-tfj8s_66e3470d-6627-44f4-be2d-b7f64ef73e9b/cert-manager-cainjector/0.log" Nov 22 04:27:49 crc kubenswrapper[4952]: I1122 04:27:49.404918 4952 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-m52fp_17db6d4f-216b-4307-9625-962bc50bc029/cert-manager-webhook/0.log" Nov 22 04:27:58 crc kubenswrapper[4952]: I1122 04:27:58.341521 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:27:58 crc kubenswrapper[4952]: I1122 04:27:58.341970 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.542695 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-gm8k5_e03eb50e-826c-40bd-9f6b-856c064dd96f/nmstate-console-plugin/0.log" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.720824 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h7ck5_dccd7a53-7367-4e5a-9c27-0e38d8dce463/nmstate-handler/0.log" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.774409 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-lr7gp_d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504/nmstate-metrics/0.log" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.781578 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-lr7gp_d55bf0cd-e2d6-4eb3-94a9-3689ee1e2504/kube-rbac-proxy/0.log" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.907986 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-n49rs_fcbe36fe-f202-4a96-9016-b1f879fb5384/nmstate-operator/0.log" Nov 22 04:28:02 crc kubenswrapper[4952]: I1122 04:28:02.964016 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-pp2cj_31032708-369f-4a6a-a5a8-99b4c72c38a1/nmstate-webhook/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.395188 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-nxf9p_97e744a4-812f-4662-bc3c-80777619b8a2/kube-rbac-proxy/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.433759 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-nxf9p_97e744a4-812f-4662-bc3c-80777619b8a2/controller/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.617462 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-frr-files/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.836227 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-reloader/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.847345 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-frr-files/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.849932 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-metrics/0.log" Nov 22 04:28:18 crc kubenswrapper[4952]: I1122 04:28:18.901837 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-reloader/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.056480 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-frr-files/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.072266 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-metrics/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.075606 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-reloader/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.103971 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-metrics/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.246672 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-reloader/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.279327 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-frr-files/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.309329 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/cp-metrics/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.352390 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/controller/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.506929 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/frr-metrics/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.524957 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/kube-rbac-proxy/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.663800 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/kube-rbac-proxy-frr/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.751437 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/reloader/0.log" Nov 22 04:28:19 crc kubenswrapper[4952]: I1122 04:28:19.894930 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-676cf_b1512c43-7cc0-4c7e-82f0-108811e38971/frr-k8s-webhook-server/0.log" Nov 22 04:28:20 crc kubenswrapper[4952]: I1122 04:28:20.005702 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cdd766b96-rgt84_96df557a-4a41-48dc-bbea-961acc5fd4db/manager/0.log" Nov 22 04:28:20 crc kubenswrapper[4952]: I1122 04:28:20.181303 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c64b8d58d-nfvjm_3bd69787-c3ae-450c-9854-1b7b9b54b379/webhook-server/0.log" Nov 22 04:28:20 crc kubenswrapper[4952]: I1122 04:28:20.326746 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9fcb_7b13d75a-b01b-4879-9ab2-1c5ffb445c38/kube-rbac-proxy/0.log" Nov 22 04:28:20 crc kubenswrapper[4952]: I1122 04:28:20.821648 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9fcb_7b13d75a-b01b-4879-9ab2-1c5ffb445c38/speaker/0.log" Nov 22 04:28:21 crc kubenswrapper[4952]: I1122 04:28:21.097958 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7rpg_59bb616f-6078-47be-a7d0-16749039f128/frr/0.log" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.341774 4952 patch_prober.go:28] interesting pod/machine-config-daemon-vn2dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.342308 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.342360 4952 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.343242 4952 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"} pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.343317 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerName="machine-config-daemon" containerID="cri-o://ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" gracePeriod=600 Nov 22 04:28:28 crc kubenswrapper[4952]: E1122 04:28:28.504699 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.798897 4952 generic.go:334] "Generic (PLEG): container finished" podID="94f311d8-e9ac-4dd7-bc2c-321490681934" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" exitCode=0 Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.798954 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" 
event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerDied","Data":"ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"} Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.799005 4952 scope.go:117] "RemoveContainer" containerID="c89d348b574dc051cb51adc67db44a5be6af54aed9490de1b1cb04b258cd8eff" Nov 22 04:28:28 crc kubenswrapper[4952]: I1122 04:28:28.799770 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:28:28 crc kubenswrapper[4952]: E1122 04:28:28.800193 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.131662 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/util/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.285618 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/util/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.301209 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/pull/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.376764 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/pull/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.548854 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/util/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.549665 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/pull/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.590060 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e92gz2_905bd3d3-e252-45e1-8d2d-287b04287e7d/extract/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.719013 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-utilities/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.858914 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-content/0.log" Nov 22 04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.921572 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-utilities/0.log" Nov 22 
04:28:34 crc kubenswrapper[4952]: I1122 04:28:34.960121 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-content/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.112816 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-content/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.134932 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/extract-utilities/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.316406 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-utilities/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.599033 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-content/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.612880 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-content/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.615124 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d4n4w_8e4baf16-6aef-43f7-85c1-9426a926aae3/registry-server/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.631736 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-utilities/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.758660 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-utilities/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.772002 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/extract-content/0.log" Nov 22 04:28:35 crc kubenswrapper[4952]: I1122 04:28:35.966493 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/util/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.195666 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/pull/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.228349 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/pull/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.245344 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/util/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.479967 4952 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/util/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.500224 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/pull/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.513110 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6plps2_59fd7e52-0ba7-42f3-b749-493cf9c5b8d2/extract/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.637165 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nhtvn_cb648a4c-177a-43a5-a924-0e0d60ad85ae/registry-server/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.785145 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dg5wr_421c7496-b72b-4558-8064-39b4578d0cda/marketplace-operator/0.log" Nov 22 04:28:36 crc kubenswrapper[4952]: I1122 04:28:36.982448 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.118210 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-content/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.129797 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.145321 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-content/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.259049 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-content/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.271845 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.490876 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.492474 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfmzc_d5ef5937-ec8f-4c34-a7a7-9b49399ef460/registry-server/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.610248 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-content/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.633113 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-content/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: 
I1122 04:28:37.652025 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.849210 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-utilities/0.log" Nov 22 04:28:37 crc kubenswrapper[4952]: I1122 04:28:37.858106 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/extract-content/0.log" Nov 22 04:28:38 crc kubenswrapper[4952]: I1122 04:28:38.467415 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t5rgp_7be8e929-2755-4dea-9693-37a6d1a099bb/registry-server/0.log" Nov 22 04:28:42 crc kubenswrapper[4952]: I1122 04:28:42.531706 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:28:42 crc kubenswrapper[4952]: E1122 04:28:42.533331 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:28:53 crc kubenswrapper[4952]: I1122 04:28:53.530741 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:28:53 crc kubenswrapper[4952]: E1122 04:28:53.531533 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:29:06 crc kubenswrapper[4952]: I1122 04:29:06.540221 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:29:06 crc kubenswrapper[4952]: E1122 04:29:06.541486 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:29:19 crc kubenswrapper[4952]: I1122 04:29:19.531330 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:29:19 crc kubenswrapper[4952]: E1122 04:29:19.533527 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" 
podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:29:30 crc kubenswrapper[4952]: I1122 04:29:30.533462 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:29:30 crc kubenswrapper[4952]: E1122 04:29:30.534465 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:29:41 crc kubenswrapper[4952]: I1122 04:29:41.532234 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:29:41 crc kubenswrapper[4952]: E1122 04:29:41.533375 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:29:55 crc kubenswrapper[4952]: I1122 04:29:55.531581 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:29:55 crc kubenswrapper[4952]: E1122 04:29:55.533010 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.181936 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2"] Nov 22 04:30:00 crc kubenswrapper[4952]: E1122 04:30:00.183391 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.183406 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4952]: E1122 04:30:00.183425 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.183432 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4952]: E1122 04:30:00.183469 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.183476 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.183859 4952 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4bfe9d56-246b-409d-9395-20d88c8274d8" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.184834 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.194555 4952 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.194720 4952 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.202251 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2"] Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.269235 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.269373 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jdb\" (UniqueName: \"kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.269402 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.371651 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.371801 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24jdb\" (UniqueName: \"kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.371835 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc 
kubenswrapper[4952]: I1122 04:30:00.372899 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.380681 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.403084 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24jdb\" (UniqueName: \"kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb\") pod \"collect-profiles-29396430-4zql2\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:00 crc kubenswrapper[4952]: I1122 04:30:00.509787 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:01 crc kubenswrapper[4952]: I1122 04:30:01.037100 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2"] Nov 22 04:30:01 crc kubenswrapper[4952]: I1122 04:30:01.771752 4952 generic.go:334] "Generic (PLEG): container finished" podID="27679b78-a191-4337-915b-9709a05d710c" containerID="50f3120432dfcd4eccc254c4606c24f96c149f27237d423a08ecfa9693627573" exitCode=0 Nov 22 04:30:01 crc kubenswrapper[4952]: I1122 04:30:01.771836 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" event={"ID":"27679b78-a191-4337-915b-9709a05d710c","Type":"ContainerDied","Data":"50f3120432dfcd4eccc254c4606c24f96c149f27237d423a08ecfa9693627573"} Nov 22 04:30:01 crc kubenswrapper[4952]: I1122 04:30:01.772321 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" event={"ID":"27679b78-a191-4337-915b-9709a05d710c","Type":"ContainerStarted","Data":"012ae8c005f63f0fa469fbfbc24ae704412ff8c2775196de3625d7d560d93f94"} Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.193716 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.230310 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume\") pod \"27679b78-a191-4337-915b-9709a05d710c\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.230404 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24jdb\" (UniqueName: \"kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb\") pod \"27679b78-a191-4337-915b-9709a05d710c\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.230539 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume\") pod \"27679b78-a191-4337-915b-9709a05d710c\" (UID: \"27679b78-a191-4337-915b-9709a05d710c\") " Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.234159 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume" (OuterVolumeSpecName: "config-volume") pod "27679b78-a191-4337-915b-9709a05d710c" (UID: "27679b78-a191-4337-915b-9709a05d710c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.238792 4952 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27679b78-a191-4337-915b-9709a05d710c-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.241793 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27679b78-a191-4337-915b-9709a05d710c" (UID: "27679b78-a191-4337-915b-9709a05d710c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.242005 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb" (OuterVolumeSpecName: "kube-api-access-24jdb") pod "27679b78-a191-4337-915b-9709a05d710c" (UID: "27679b78-a191-4337-915b-9709a05d710c"). InnerVolumeSpecName "kube-api-access-24jdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.339898 4952 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27679b78-a191-4337-915b-9709a05d710c-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.339932 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24jdb\" (UniqueName: \"kubernetes.io/projected/27679b78-a191-4337-915b-9709a05d710c-kube-api-access-24jdb\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.798047 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" event={"ID":"27679b78-a191-4337-915b-9709a05d710c","Type":"ContainerDied","Data":"012ae8c005f63f0fa469fbfbc24ae704412ff8c2775196de3625d7d560d93f94"} Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.798474 4952 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="012ae8c005f63f0fa469fbfbc24ae704412ff8c2775196de3625d7d560d93f94" Nov 22 04:30:03 crc kubenswrapper[4952]: I1122 04:30:03.798123 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-4zql2" Nov 22 04:30:04 crc kubenswrapper[4952]: I1122 04:30:04.280802 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw"] Nov 22 04:30:04 crc kubenswrapper[4952]: I1122 04:30:04.290748 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-r8zfw"] Nov 22 04:30:04 crc kubenswrapper[4952]: I1122 04:30:04.544869 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ab05d2-2b72-4a8b-b649-6802aba4fc7d" path="/var/lib/kubelet/pods/92ab05d2-2b72-4a8b-b649-6802aba4fc7d/volumes" Nov 22 04:30:04 crc kubenswrapper[4952]: I1122 04:30:04.791308 4952 scope.go:117] "RemoveContainer" containerID="243943688a8d366a7d32df2c915a26d369adbcaaf6dabfbf476f6affc344daa1" Nov 22 04:30:06 crc kubenswrapper[4952]: I1122 04:30:06.547360 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:30:06 crc kubenswrapper[4952]: E1122 04:30:06.547985 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:30:17 crc kubenswrapper[4952]: I1122 04:30:17.531454 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:30:17 crc kubenswrapper[4952]: E1122 04:30:17.532497 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 
Nov 22 04:30:32 crc kubenswrapper[4952]: I1122 04:30:32.532014 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:30:32 crc kubenswrapper[4952]: E1122 04:30:32.533310 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:30:34 crc kubenswrapper[4952]: I1122 04:30:34.734128 4952 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ba66b462-c52b-4474-80c9-670bf6be8870" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Nov 22 04:30:38 crc kubenswrapper[4952]: I1122 04:30:38.233741 4952 generic.go:334] "Generic (PLEG): container finished" podID="726bf479-478d-4230-965e-78041e86ad1f" containerID="6d65902ba02bef3280d837f104c790f7a1e8c1a43bf7bf0ae55fb3d54ffab97a" exitCode=0
Nov 22 04:30:38 crc kubenswrapper[4952]: I1122 04:30:38.233874 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" event={"ID":"726bf479-478d-4230-965e-78041e86ad1f","Type":"ContainerDied","Data":"6d65902ba02bef3280d837f104c790f7a1e8c1a43bf7bf0ae55fb3d54ffab97a"}
Nov 22 04:30:38 crc kubenswrapper[4952]: I1122 04:30:38.235930 4952 scope.go:117] "RemoveContainer" containerID="6d65902ba02bef3280d837f104c790f7a1e8c1a43bf7bf0ae55fb3d54ffab97a"
Nov 22 04:30:38 crc kubenswrapper[4952]: I1122 04:30:38.563530 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjcqx_must-gather-pt4g2_726bf479-478d-4230-965e-78041e86ad1f/gather/0.log"
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.241944 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjcqx/must-gather-pt4g2"]
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.242867 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jjcqx/must-gather-pt4g2" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="copy" containerID="cri-o://1a864d4a9881e624d7b6df9a55388c1d8f0f11e4f03f12f7ecda308b03fde033" gracePeriod=2
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.258896 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjcqx/must-gather-pt4g2"]
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.366435 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjcqx_must-gather-pt4g2_726bf479-478d-4230-965e-78041e86ad1f/copy/0.log"
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.366836 4952 generic.go:334] "Generic (PLEG): container finished" podID="726bf479-478d-4230-965e-78041e86ad1f" containerID="1a864d4a9881e624d7b6df9a55388c1d8f0f11e4f03f12f7ecda308b03fde033" exitCode=143
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.532153 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:30:47 crc kubenswrapper[4952]: E1122 04:30:47.533009 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.816680 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjcqx_must-gather-pt4g2_726bf479-478d-4230-965e-78041e86ad1f/copy/0.log"
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.819003 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/must-gather-pt4g2"
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.916587 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbcq9\" (UniqueName: \"kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9\") pod \"726bf479-478d-4230-965e-78041e86ad1f\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") "
Nov 22 04:30:47 crc kubenswrapper[4952]: I1122 04:30:47.916668 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output\") pod \"726bf479-478d-4230-965e-78041e86ad1f\" (UID: \"726bf479-478d-4230-965e-78041e86ad1f\") "
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.070591 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "726bf479-478d-4230-965e-78041e86ad1f" (UID: "726bf479-478d-4230-965e-78041e86ad1f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.122025 4952 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/726bf479-478d-4230-965e-78041e86ad1f-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.363695 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9" (OuterVolumeSpecName: "kube-api-access-vbcq9") pod "726bf479-478d-4230-965e-78041e86ad1f" (UID: "726bf479-478d-4230-965e-78041e86ad1f"). InnerVolumeSpecName "kube-api-access-vbcq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.384229 4952 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjcqx_must-gather-pt4g2_726bf479-478d-4230-965e-78041e86ad1f/copy/0.log"
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.384773 4952 scope.go:117] "RemoveContainer" containerID="1a864d4a9881e624d7b6df9a55388c1d8f0f11e4f03f12f7ecda308b03fde033"
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.384926 4952 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjcqx/must-gather-pt4g2"
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.416050 4952 scope.go:117] "RemoveContainer" containerID="6d65902ba02bef3280d837f104c790f7a1e8c1a43bf7bf0ae55fb3d54ffab97a"
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.428931 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbcq9\" (UniqueName: \"kubernetes.io/projected/726bf479-478d-4230-965e-78041e86ad1f-kube-api-access-vbcq9\") on node \"crc\" DevicePath \"\""
Nov 22 04:30:48 crc kubenswrapper[4952]: I1122 04:30:48.548734 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726bf479-478d-4230-965e-78041e86ad1f" path="/var/lib/kubelet/pods/726bf479-478d-4230-965e-78041e86ad1f/volumes"
Nov 22 04:30:58 crc kubenswrapper[4952]: I1122 04:30:58.534064 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:30:58 crc kubenswrapper[4952]: E1122 04:30:58.534760 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:31:04 crc kubenswrapper[4952]: I1122 04:31:04.874066 4952 scope.go:117] "RemoveContainer" containerID="430287ee93ec7f5539503a592fe0449f32e3847b09aefd7c08fdfc9e7db94266"
Nov 22 04:31:11 crc kubenswrapper[4952]: I1122 04:31:11.531015 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:31:11 crc kubenswrapper[4952]: E1122 04:31:11.531906 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:31:23 crc kubenswrapper[4952]: I1122 04:31:23.531235 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:31:23 crc kubenswrapper[4952]: E1122 04:31:23.532245 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
Nov 22 04:31:37 crc kubenswrapper[4952]: I1122 04:31:37.531471 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398"
Nov 22 04:31:37 crc kubenswrapper[4952]: E1122 04:31:37.532579 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934"
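
[Editor's note] The must-gather copy container above was deleted with gracePeriod=2 and exited with code 143, which is the conventional 128 + 15 encoding: the process died from the SIGTERM delivered at the start of the grace period (by contrast, the gather container's exitCode=0 at 04:30:38 was a clean finish). A tiny decoder for that convention:

    package main

    import "fmt"

    // signalNames covers the two signals relevant to graceful container
    // shutdown; 128+N is the conventional exit code for death by signal N.
    var signalNames = map[int]string{15: "SIGTERM", 9: "SIGKILL"}

    func describeExit(code int) string {
    	if code > 128 {
    		sig := code - 128
    		name := signalNames[sig]
    		if name == "" {
    			name = fmt.Sprintf("signal %d", sig)
    		}
    		return fmt.Sprintf("exit code %d = 128 + %d (killed by %s)", code, sig, name)
    	}
    	return fmt.Sprintf("exit code %d (normal exit status)", code)
    }

    func main() {
    	fmt.Println(describeExit(143)) // the copy container above
    	fmt.Println(describeExit(0))   // the gather container, which finished cleanly
    }
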
pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:31:48 crc kubenswrapper[4952]: I1122 04:31:48.918131 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:31:48 crc kubenswrapper[4952]: E1122 04:31:48.920888 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.566432 4952 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:31:50 crc kubenswrapper[4952]: E1122 04:31:50.568208 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="gather" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568253 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="gather" Nov 22 04:31:50 crc kubenswrapper[4952]: E1122 04:31:50.568277 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27679b78-a191-4337-915b-9709a05d710c" containerName="collect-profiles" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568293 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="27679b78-a191-4337-915b-9709a05d710c" containerName="collect-profiles" Nov 22 04:31:50 crc kubenswrapper[4952]: E1122 04:31:50.568333 4952 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="copy" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568350 4952 state_mem.go:107] "Deleted CPUSet assignment" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="copy" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568855 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="gather" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568900 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="27679b78-a191-4337-915b-9709a05d710c" containerName="collect-profiles" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.568938 4952 memory_manager.go:354] "RemoveStaleState removing state" podUID="726bf479-478d-4230-965e-78041e86ad1f" containerName="copy" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.572778 4952 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.581221 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.745735 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.745853 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.746016 4952 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl72k\" (UniqueName: \"kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.847906 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.848125 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl72k\" (UniqueName: \"kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.848154 4952 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.848762 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.849043 4952 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.882171 4952 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bl72k\" (UniqueName: \"kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k\") pod \"redhat-marketplace-f2q5b\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:50 crc kubenswrapper[4952]: I1122 04:31:50.907653 4952 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:31:51 crc kubenswrapper[4952]: I1122 04:31:51.430822 4952 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:31:52 crc kubenswrapper[4952]: I1122 04:31:52.092379 4952 generic.go:334] "Generic (PLEG): container finished" podID="2690964d-6776-4632-b6c2-faa655aa3c77" containerID="c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a" exitCode=0 Nov 22 04:31:52 crc kubenswrapper[4952]: I1122 04:31:52.092454 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerDied","Data":"c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a"} Nov 22 04:31:52 crc kubenswrapper[4952]: I1122 04:31:52.092831 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerStarted","Data":"fa54429208e8aef736ed5f703f2b0cb1835aafc36c03c981a5302ccead1f73bd"} Nov 22 04:31:52 crc kubenswrapper[4952]: I1122 04:31:52.095122 4952 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:31:53 crc kubenswrapper[4952]: I1122 04:31:53.109719 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerStarted","Data":"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4"} Nov 22 04:31:54 crc kubenswrapper[4952]: I1122 04:31:54.125312 4952 generic.go:334] "Generic (PLEG): container finished" podID="2690964d-6776-4632-b6c2-faa655aa3c77" containerID="7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4" exitCode=0 Nov 22 04:31:54 crc kubenswrapper[4952]: I1122 04:31:54.125389 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerDied","Data":"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4"} Nov 22 04:31:55 crc kubenswrapper[4952]: I1122 04:31:55.141463 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerStarted","Data":"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9"} Nov 22 04:31:55 crc kubenswrapper[4952]: I1122 04:31:55.169329 4952 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2q5b" podStartSLOduration=2.637581106 podStartE2EDuration="5.169302146s" podCreationTimestamp="2025-11-22 04:31:50 +0000 UTC" firstStartedPulling="2025-11-22 04:31:52.094897206 +0000 UTC m=+5876.400914479" lastFinishedPulling="2025-11-22 04:31:54.626618206 +0000 UTC m=+5878.932635519" observedRunningTime="2025-11-22 04:31:55.169201873 +0000 UTC m=+5879.475219216" watchObservedRunningTime="2025-11-22 04:31:55.169302146 +0000 UTC 
m=+5879.475319459" Nov 22 04:32:00 crc kubenswrapper[4952]: I1122 04:32:00.908165 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:00 crc kubenswrapper[4952]: I1122 04:32:00.909061 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:00 crc kubenswrapper[4952]: I1122 04:32:00.975376 4952 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:01 crc kubenswrapper[4952]: I1122 04:32:01.263137 4952 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:01 crc kubenswrapper[4952]: I1122 04:32:01.336603 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:32:01 crc kubenswrapper[4952]: I1122 04:32:01.532048 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:32:01 crc kubenswrapper[4952]: E1122 04:32:01.532405 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.231608 4952 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f2q5b" podUID="2690964d-6776-4632-b6c2-faa655aa3c77" containerName="registry-server" containerID="cri-o://ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9" gracePeriod=2 Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.837454 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.964789 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl72k\" (UniqueName: \"kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k\") pod \"2690964d-6776-4632-b6c2-faa655aa3c77\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.964881 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities\") pod \"2690964d-6776-4632-b6c2-faa655aa3c77\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.965009 4952 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content\") pod \"2690964d-6776-4632-b6c2-faa655aa3c77\" (UID: \"2690964d-6776-4632-b6c2-faa655aa3c77\") " Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.967027 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities" (OuterVolumeSpecName: "utilities") pod "2690964d-6776-4632-b6c2-faa655aa3c77" (UID: "2690964d-6776-4632-b6c2-faa655aa3c77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.976153 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k" (OuterVolumeSpecName: "kube-api-access-bl72k") pod "2690964d-6776-4632-b6c2-faa655aa3c77" (UID: "2690964d-6776-4632-b6c2-faa655aa3c77"). InnerVolumeSpecName "kube-api-access-bl72k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:32:03 crc kubenswrapper[4952]: I1122 04:32:03.988313 4952 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2690964d-6776-4632-b6c2-faa655aa3c77" (UID: "2690964d-6776-4632-b6c2-faa655aa3c77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.067913 4952 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl72k\" (UniqueName: \"kubernetes.io/projected/2690964d-6776-4632-b6c2-faa655aa3c77-kube-api-access-bl72k\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.067949 4952 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.067961 4952 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2690964d-6776-4632-b6c2-faa655aa3c77-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.245950 4952 generic.go:334] "Generic (PLEG): container finished" podID="2690964d-6776-4632-b6c2-faa655aa3c77" containerID="ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9" exitCode=0 Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.246001 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerDied","Data":"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9"} Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.246033 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2q5b" event={"ID":"2690964d-6776-4632-b6c2-faa655aa3c77","Type":"ContainerDied","Data":"fa54429208e8aef736ed5f703f2b0cb1835aafc36c03c981a5302ccead1f73bd"} Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.246055 4952 scope.go:117] "RemoveContainer" containerID="ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.246110 4952 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2q5b" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.285485 4952 scope.go:117] "RemoveContainer" containerID="7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.291681 4952 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.304167 4952 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2q5b"] Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.326370 4952 scope.go:117] "RemoveContainer" containerID="c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.373736 4952 scope.go:117] "RemoveContainer" containerID="ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9" Nov 22 04:32:04 crc kubenswrapper[4952]: E1122 04:32:04.374310 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9\": container with ID starting with ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9 not found: ID does not exist" containerID="ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.374338 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9"} err="failed to get container status \"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9\": rpc error: code = NotFound desc = could not find container \"ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9\": container with ID starting with ae6617aff71f6a4f5f45c277a38ca1d669d318db629fdd32504f7cdd10511af9 not found: ID does not exist" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.374356 4952 scope.go:117] "RemoveContainer" containerID="7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4" Nov 22 04:32:04 crc kubenswrapper[4952]: E1122 04:32:04.374963 4952 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4\": container with ID starting with 7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4 not found: ID does not exist" containerID="7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.375034 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4"} err="failed to get container status \"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4\": rpc error: code = NotFound desc = could not find container \"7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4\": container with ID starting with 7419ed6e32fef2dffa8eb8fd8effd933feb4bba912aa7098c109db1519cf6db4 not found: ID does not exist" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.375081 4952 scope.go:117] "RemoveContainer" containerID="c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a" Nov 22 04:32:04 crc kubenswrapper[4952]: E1122 04:32:04.375512 4952 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a\": container with ID starting with c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a not found: ID does not exist" containerID="c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.375584 4952 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a"} err="failed to get container status \"c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a\": rpc error: code = NotFound desc = could not find container \"c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a\": container with ID starting with c581cfc9717d0af2d136ed52a3159a20834174aee2a65e44e52e8b9ef6ab0e0a not found: ID does not exist" Nov 22 04:32:04 crc kubenswrapper[4952]: I1122 04:32:04.540537 4952 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2690964d-6776-4632-b6c2-faa655aa3c77" path="/var/lib/kubelet/pods/2690964d-6776-4632-b6c2-faa655aa3c77/volumes" Nov 22 04:32:14 crc kubenswrapper[4952]: I1122 04:32:14.532139 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:32:14 crc kubenswrapper[4952]: E1122 04:32:14.533293 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:32:25 crc kubenswrapper[4952]: I1122 04:32:25.531409 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:32:25 crc kubenswrapper[4952]: E1122 04:32:25.532345 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:32:39 crc kubenswrapper[4952]: I1122 04:32:39.531434 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:32:39 crc kubenswrapper[4952]: E1122 04:32:39.533006 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:32:51 crc kubenswrapper[4952]: I1122 04:32:51.531969 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:32:51 crc kubenswrapper[4952]: E1122 04:32:51.533023 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:33:04 crc kubenswrapper[4952]: I1122 04:33:04.532331 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:33:04 crc kubenswrapper[4952]: E1122 04:33:04.533458 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:33:05 crc kubenswrapper[4952]: I1122 04:33:05.020109 4952 scope.go:117] "RemoveContainer" containerID="66f460845b5f57c5d80cb2410530436869e6e9bd6ab4f5906729159363858484" Nov 22 04:33:19 crc kubenswrapper[4952]: I1122 04:33:19.533285 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:33:19 crc kubenswrapper[4952]: E1122 04:33:19.534357 4952 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn2dl_openshift-machine-config-operator(94f311d8-e9ac-4dd7-bc2c-321490681934)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" podUID="94f311d8-e9ac-4dd7-bc2c-321490681934" Nov 22 04:33:34 crc kubenswrapper[4952]: I1122 04:33:34.532107 4952 scope.go:117] "RemoveContainer" containerID="ad9d9cb96e1f317a6208c588bc4ee81188dcface0ec570bdd745fcaeff952398" Nov 22 04:33:35 crc kubenswrapper[4952]: I1122 04:33:35.319793 4952 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn2dl" event={"ID":"94f311d8-e9ac-4dd7-bc2c-321490681934","Type":"ContainerStarted","Data":"a4cefa06f87b1f7db53141db6f0f2ba99b41b4b3b77ff64ce32cfca271178c23"}